  1. "C:\Program Files\Java\jdk1.8.0_191\bin\java.exe" -agentlib:jdwp=transport=dt_socket,address=127.0.0.1:52686,suspend=y,server=n -javaagent:C:\Users\zhaoy\.IntelliJIdea2018.3\system\captureAgent\debugger-agent.jar -Dfile.encoding=UTF-8 -classpath C:\Users\zhaoy\AppData\Local\Temp\classpath61260294.jar com.nari.bdp.mine_server_test_behavior.classification.DNNClassificationTest
  2. Connected to the target VM, address: '127.0.0.1:52686', transport: 'socket'
  3. /C:/Users/zhaoy/IdeaProjects/nari.work/work2/execute_behavior/mine/test_behavior/target/classes/test_xml/bdp/regression/DNNClassfication.xml
  4. [INFO ] 2019-08-09 16:01:50 [main] c.n.b.m.e.service.MineJobExecutor - =========Execution started=========
  5. ########### className = com.nari.bdp.mine.cloud.io.readdatabank.behavior.ReadDataBankBehavior
  6. ########### className = com.nari.bdp.mine.cloud.dp.setrole.behavior.SetroleBehavior
  7. ########### className = com.nari.bdp.mine.cloud.classification.classificationdnn.behavior.ClassificationdnnBehavior
  8. cost time: 1.035000s
  9. [INFO ] 2019-08-09 16:01:50 [main] c.n.b.m.c.i.r.b.ReadDataBankBehavior - ===========Started executing node com.nari.bdp.mine.cloud.io.readdatabank.behavior.ReadDataBankBehavior=============
  10. File data source cost time: 27.648000s
  11. Temp--static column:pollution,dew,temp,press,wnd_spd,snow,rain used 4.202 seconds
  12. Temp--static column:null,null,null,null,null,null,null,null,null used 0.0 seconds
  13. +--------------+---------+---+----+-----+-------+-------+----+----+
  14. | date_time|pollution|dew|temp|press|wnd_dir|wnd_spd|snow|rain|
  15. +--------------+---------+---+----+-----+-------+-------+----+----+
  16. | 2014/1/1 0:00| 24|-20| 7| 1014| NW| 143.48| 0| 0|
  17. | 2014/1/1 1:00| 53|-20| 7| 1013| NW| 147.5| 0| 0|
  18. | 2014/1/1 2:00| 65|-20| 6| 1013| NW| 151.52| 0| 0|
  19. | 2014/1/1 3:00| 70|-20| 6| 1013| NW| 153.31| 0| 0|
  20. | 2014/1/1 4:00| 79|-18| 3| 1012| cv| 0.89| 0| 0|
  21. | 2014/1/1 5:00| 92|-18| 4| 1012| NW| 4.02| 0| 0|
  22. | 2014/1/1 6:00| 106|-19| 6| 1012| NW| 8.94| 0| 0|
  23. | 2014/1/1 7:00| 75|-19| 6| 1013| NW| 16.09| 0| 0|
  24. | 2014/1/1 8:00| 58|-18| 6| 1013| NW| 21.9| 0| 0|
  25. | 2014/1/1 9:00| 33|-18| 7| 1014| NW| 26.82| 0| 0|
  26. |2014/1/1 10:00| 51|-18| 8| 1015| NW| 31.74| 0| 0|
  27. |2014/1/1 11:00| 32|-18| 9| 1015| NW| 39.79| 0| 0|
  28. |2014/1/1 12:00| 23|-17| 10| 1015| NW| 48.73| 0| 0|
  29. |2014/1/1 13:00| 28|-18| 11| 1014| NW| 55.88| 0| 0|
  30. |2014/1/1 14:00| 23|-17| 11| 1014| NW| 63.93| 0| 0|
  31. |2014/1/1 15:00| 24|-17| 11| 1014| NW| 71.08| 0| 0|
  32. |2014/1/1 16:00| 26|-17| 11| 1014| NW| 76.89| 0| 0|
  33. |2014/1/1 17:00| 26|-16| 10| 1015| NW| 81.81| 0| 0|
  34. |2014/1/1 18:00| 27|-16| 9| 1015| NW| 84.94| 0| 0|
  35. |2014/1/1 19:00| 43|-16| 9| 1016| SE| 1.79| 0| 0|
  36. +--------------+---------+---+----+-----+-------+-------+----+----+
  37. only showing top 20 rows
  38.  
  39. Set role cost time: 0.569000s
  40. +--------------+---------+---+----+-----+-------+-------+----+----+
  41. | date_time|pollution|dew|temp|press|wnd_dir|wnd_spd|snow|rain|
  42. +--------------+---------+---+----+-----+-------+-------+----+----+
  43. | 2014/1/1 0:00| 24|-20| 7| 1014| NW| 143.48| 0| 0|
  44. | 2014/1/1 1:00| 53|-20| 7| 1013| NW| 147.5| 0| 0|
  45. | 2014/1/1 2:00| 65|-20| 6| 1013| NW| 151.52| 0| 0|
  46. | 2014/1/1 3:00| 70|-20| 6| 1013| NW| 153.31| 0| 0|
  47. | 2014/1/1 4:00| 79|-18| 3| 1012| cv| 0.89| 0| 0|
  48. | 2014/1/1 5:00| 92|-18| 4| 1012| NW| 4.02| 0| 0|
  49. | 2014/1/1 6:00| 106|-19| 6| 1012| NW| 8.94| 0| 0|
  50. | 2014/1/1 7:00| 75|-19| 6| 1013| NW| 16.09| 0| 0|
  51. | 2014/1/1 8:00| 58|-18| 6| 1013| NW| 21.9| 0| 0|
  52. | 2014/1/1 9:00| 33|-18| 7| 1014| NW| 26.82| 0| 0|
  53. |2014/1/1 10:00| 51|-18| 8| 1015| NW| 31.74| 0| 0|
  54. |2014/1/1 11:00| 32|-18| 9| 1015| NW| 39.79| 0| 0|
  55. |2014/1/1 12:00| 23|-17| 10| 1015| NW| 48.73| 0| 0|
  56. |2014/1/1 13:00| 28|-18| 11| 1014| NW| 55.88| 0| 0|
  57. |2014/1/1 14:00| 23|-17| 11| 1014| NW| 63.93| 0| 0|
  58. |2014/1/1 15:00| 24|-17| 11| 1014| NW| 71.08| 0| 0|
  59. |2014/1/1 16:00| 26|-17| 11| 1014| NW| 76.89| 0| 0|
  60. |2014/1/1 17:00| 26|-16| 10| 1015| NW| 81.81| 0| 0|
  61. |2014/1/1 18:00| 27|-16| 9| 1015| NW| 84.94| 0| 0|
  62. |2014/1/1 19:00| 43|-16| 9| 1016| SE| 1.79| 0| 0|
  63. |2014/1/1 20:00| 62|-14| 3| 1017| SE| 2.68| 0| 0|
  64. |2014/1/1 21:00| 70|-14| 0| 1017| NE| 1.79| 0| 0|
  65. |2014/1/1 22:00| 81|-12| -1| 1018| SE| 0.89| 0| 0|
  66. |2014/1/1 23:00| 111|-12| 0| 1019| NW| 1.79| 0| 0|
  67. | 2014/1/2 0:00| 144|-13| -2| 1019| NE| 0.89| 0| 0|
  68. | 2014/1/2 1:00| 170|-12| -4| 1019| cv| 0.89| 0| 0|
  69. | 2014/1/2 2:00| 174|-12| -4| 1019| cv| 1.34| 0| 0|
  70. | 2014/1/2 3:00| 174|-12| -4| 1019| NW| 0.89| 0| 0|
  71. | 2014/1/2 4:00| 172|-12| -5| 1020| cv| 0.89| 0| 0|
  72. | 2014/1/2 5:00| 149|-10| -2| 1020| SE| 1.79| 0| 0|
  73. | 2014/1/2 6:00| 166| -7| -2| 1020| SE| 3.58| 0| 0|
  74. | 2014/1/2 7:00| 187| -9| -5| 1020| NE| 0.89| 0| 0|
  75. | 2014/1/2 8:00| 107| -9| -5| 1021| NW| 3.13| 0| 0|
  76. | 2014/1/2 9:00| 114| -7| -2| 1021| NW| 4.92| 0| 0|
  77. |2014/1/2 10:00| 108| -7| 2| 1021| NW| 6.71| 0| 0|
  78. |2014/1/2 11:00| 102| -8| 4| 1020| cv| 0.89| 0| 0|
  79. |2014/1/2 12:00| 95| -8| 5| 1019| cv| 1.78| 0| 0|
  80. |2014/1/2 13:00| 127| -9| 7| 1017| cv| 2.67| 0| 0|
  81. |2014/1/2 14:00| 125| -9| 7| 1016| cv| 4.46| 0| 0|
  82. |2014/1/2 15:00| 128|-10| 7| 1016| cv| 5.35| 0| 0|
  83. |2014/1/2 16:00| 146|-10| 6| 1016| cv| 5.8| 0| 0|
  84. |2014/1/2 17:00| 165| -9| 4| 1016| cv| 6.69| 0| 0|
  85. |2014/1/2 18:00| 173| -8| 3| 1016| cv| 7.58| 0| 0|
  86. |2014/1/2 19:00| 195| -9| 1| 1016| cv| 8.47| 0| 0|
  87. |2014/1/2 20:00| 239| -8| 0| 1017| cv| 9.36| 0| 0|
  88. |2014/1/2 21:00| 232| -8| 0| 1017| SE| 0.89| 0| 0|
  89. |2014/1/2 22:00| 242| -8| -1| 1017| NE| 1.79| 0| 0|
  90. |2014/1/2 23:00| 269| -7| -1| 1018| NW| 4.02| 0| 0|
  91. | 2014/1/3 0:00| 264| -9| 0| 1018| NW| 7.15| 0| 0|
  92. | 2014/1/3 1:00| 220| -9| 1| 1018| cv| 0.45| 0| 0|
  93. | 2014/1/3 2:00| 146| -9| 0| 1019| NW| 4.02| 0| 0|
  94. | 2014/1/3 3:00| 34| -9| 1| 1019| cv| 0.89| 0| 0|
  95. | 2014/1/3 4:00| 34| -9| -2| 1020| NW| 3.13| 0| 0|
  96. | 2014/1/3 5:00| 35| -9| -1| 1020| NW| 7.15| 0| 0|
  97. | 2014/1/3 6:00| 45|-13| 0| 1020| cv| 0.89| 0| 0|
  98. | 2014/1/3 7:00| 43|-13| -1| 1022| NW| 3.13| 0| 0|
  99. | 2014/1/3 8:00| 43|-12| -1| 1023| NW| 7.15| 0| 0|
  100. | 2014/1/3 9:00| 36|-13| 4| 1024| NW| 11.17| 0| 0|
  101. |2014/1/3 10:00| 36|-16| 7| 1025| NW| 14.3| 0| 0|
  102. |2014/1/3 11:00| 23|-16| 8| 1024| NW| 17.43| 0| 0|
  103. |2014/1/3 12:00| 25|-16| 8| 1023| NW| 20.56| 0| 0|
  104. |2014/1/3 13:00| 29|-17| 9| 1022| NW| 22.35| 0| 0|
  105. |2014/1/3 14:00| 26|-18| 9| 1022| NE| 1.79| 0| 0|
  106. |2014/1/3 15:00| 21|-17| 8| 1022| NE| 3.58| 0| 0|
  107. |2014/1/3 16:00| 25|-17| 7| 1022| SE| 1.79| 0| 0|
  108. |2014/1/3 17:00| 31|-17| 5| 1023| SE| 3.58| 0| 0|
  109. |2014/1/3 18:00| 43|-17| 3| 1023| cv| 0.45| 0| 0|
  110. |2014/1/3 19:00| 46|-17| 3| 1024| cv| 1.34| 0| 0|
  111. |2014/1/3 20:00| 50|-16| 1| 1024| cv| 2.23| 0| 0|
  112. |2014/1/3 21:00| 68|-13| -1| 1024| cv| 2.68| 0| 0|
  113. |2014/1/3 22:00| 60|-11| -1| 1025| SE| 1.79| 0| 0|
  114. |2014/1/3 23:00| 103| -8| -1| 1025| SE| 4.92| 0| 0|
  115. | 2014/1/4 0:00| 85| -6| -1| 1026| SE| 6.71| 0| 0|
  116. | 2014/1/4 1:00| 86| -6| -2| 1025| SE| 8.5| 0| 0|
  117. | 2014/1/4 2:00| 89| -7| -2| 1025| cv| 0.89| 0| 0|
  118. | 2014/1/4 3:00| 77| -7| -2| 1025| SE| 1.79| 0| 0|
  119. | 2014/1/4 4:00| 77| -9| -5| 1025| cv| 0.45| 0| 0|
  120. | 2014/1/4 5:00| 75| -9| -5| 1025| cv| 1.34| 0| 0|
  121. | 2014/1/4 6:00| 80| -9| -6| 1025| cv| 1.79| 0| 0|
  122. | 2014/1/4 7:00| 86| -7| -4| 1025| SE| 0.89| 0| 0|
  123. | 2014/1/4 8:00| 95| -7| -3| 1025| cv| 0.89| 0| 0|
  124. | 2014/1/4 9:00| 101| -6| -2| 1025| SE| 0.89| 0| 0|
  125. |2014/1/4 10:00| 132| -6| -1| 1025| SE| 1.78| 0| 0|
  126. |2014/1/4 11:00| 153| -5| -1| 1024| SE| 3.57| 0| 0|
  127. |2014/1/4 12:00| 173| -6| 0| 1023| NE| 1.79| 0| 0|
  128. |2014/1/4 13:00| 178| -7| 1| 1021| cv| 0.89| 0| 0|
  129. |2014/1/4 14:00| 176| -7| 2| 1020| SE| 1.79| 0| 0|
  130. |2014/1/4 15:00| 209| -7| 2| 1020| cv| 0.89| 0| 0|
  131. |2014/1/4 16:00| 219| -7| 2| 1020| NW| 1.79| 0| 0|
  132. |2014/1/4 17:00| 224| -7| 1| 1020| NW| 3.58| 0| 0|
  133. |2014/1/4 18:00| 212| -8| -2| 1020| NW| 6.71| 0| 0|
  134. |2014/1/4 19:00| 221| -8| -3| 1021| NW| 8.5| 0| 0|
  135. |2014/1/4 20:00| 221| -8| -4| 1021| NW| 10.29| 0| 0|
  136. |2014/1/4 21:00| 217| -8| -4| 1021| cv| 0.89| 0| 0|
  137. |2014/1/4 22:00| 203| -8| -4| 1021| cv| 1.78| 0| 0|
  138. |2014/1/4 23:00| 221| -9| -5| 1021| NW| 1.79| 0| 0|
  139. | 2014/1/5 0:00| 192| -8| -4| 1021| NW| 3.58| 0| 0|
  140. | 2014/1/5 1:00| 183|-10| -6| 1022| NW| 6.71| 0| 0|
  141. | 2014/1/5 2:00| 175| -8| -5| 1022| NW| 10.73| 0| 0|
  142. | 2014/1/5 3:00| 177| -9| -6| 1022| NW| 14.75| 0| 0|
  143. +--------------+---------+---+----+-----+-------+-------+----+----+
  144. only showing top 100 rows
  145.  
  146. root
  147. |-- date_time: string (nullable = true)
  148. |-- pollution: integer (nullable = true)
  149. |-- dew: integer (nullable = true)
  150. |-- temp: integer (nullable = true)
  151. |-- press: integer (nullable = true)
  152. |-- wnd_dir: string (nullable = true)
  153. |-- wnd_spd: double (nullable = true)
  154. |-- snow: integer (nullable = true)
  155. |-- rain: integer (nullable = true)
  156. |-- predicition: string (nullable = true)
  157. |-- probability: vector (nullable = false)
  158. |-- prob_SE: double (nullable = true)
  159. |-- prob_NW: double (nullable = true)
  160. |-- prob_cv: double (nullable = true)
  161. |-- prob_NE: double (nullable = true)
  162.  
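The schema printed above is what Dataset.printSchema() reports for the classification output; the probability field is a Spark ML vector column. A minimal sketch of how such a schema could be declared programmatically in Java is shown below. This is an illustration only: the class name is hypothetical, only a subset of the printed columns is included, and SQLDataTypes.VectorType() is used as the public DataType for org.apache.spark.ml.linalg vector columns.

import org.apache.spark.ml.linalg.SQLDataTypes;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class PredictionSchemaSketch {
    // Reduced version of the schema printed in the log above.
    // Column names (including "predicition") are copied verbatim from the output.
    public static StructType schema() {
        return new StructType(new StructField[] {
            new StructField("date_time",   DataTypes.StringType,      true,  Metadata.empty()),
            new StructField("pollution",   DataTypes.IntegerType,     true,  Metadata.empty()),
            new StructField("wnd_spd",     DataTypes.DoubleType,      true,  Metadata.empty()),
            new StructField("predicition", DataTypes.StringType,      true,  Metadata.empty()),
            new StructField("probability", SQLDataTypes.VectorType(), false, Metadata.empty()),
            new StructField("prob_SE",     DataTypes.DoubleType,      true,  Metadata.empty())
        });
    }
}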
  163. DNN classification cost time: 712.532000s
  164. org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 28.0 failed 1 times, most recent failure: Lost task 3.0 in stage 28.0 (TID 79, localhost, executor driver): java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  165. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 0, date_time), StringType), true, false) AS date_time#335
  166. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 1, pollution), IntegerType) AS pollution#336
  167. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 2, dew), IntegerType) AS dew#337
  168. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 3, temp), IntegerType) AS temp#338
  169. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 4, press), IntegerType) AS press#339
  170. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 5, wnd_dir), StringType), true, false) AS wnd_dir#340
  171. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 6, wnd_spd), DoubleType) AS wnd_spd#341
  172. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 7, snow), IntegerType) AS snow#342
  173. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 8, rain), IntegerType) AS rain#343
  174. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 9, label_numeric), DoubleType) AS label_numeric#344
  175. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutput#345
  176. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS minMaxOutput#346
  177. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 12, predicition), DoubleType) AS predicition#347
  178. newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS probability#348
  179. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:291)
  180. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  181. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  182. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  183. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  184. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
  185. at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
  186. at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
  187. at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:216)
  188. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:295)
  189. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:266)
  190. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  191. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  192. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  193. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  194. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  195. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  196. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  197. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  198. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
  199. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
  200. at org.apache.spark.scheduler.Task.run(Task.scala:109)
  201. at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
  202. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  203. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  204. at java.lang.Thread.run(Thread.java:748)
  205. Caused by: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  206. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.Invoke2$(Unknown Source)
  207. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields3_5$(Unknown Source)
  208. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
  209. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:288)
  210. ... 25 more
  211.  
  212. Driver stacktrace:
  213. at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1599)
  214. at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1587)
  215. at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1586)
  216. at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  217. at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
  218. at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1586)
  219. at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
  220. at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
  221. at scala.Option.foreach(Option.scala:257)
  222. at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:831)
  223. at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1820)
  224. at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1769)
  225. at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1758)
  226. at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
  227. at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:642)
  228. at org.apache.spark.SparkContext.runJob(SparkContext.scala:2027)
  229. at org.apache.spark.SparkContext.runJob(SparkContext.scala:2048)
  230. at org.apache.spark.SparkContext.runJob(SparkContext.scala:2067)
  231. at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:363)
  232. at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:38)
  233. at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3272)
  234. at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
  235. at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
  236. at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3253)
  237. at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
  238. at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3252)
  239. at org.apache.spark.sql.Dataset.head(Dataset.scala:2484)
  240. at org.apache.spark.sql.Dataset.take(Dataset.scala:2698)
  241. at com.nari.bdp.mine.base.utils.MineBaseUtil.convertDfToJavaListDColumns(MineBaseUtil.java:76)
  242. at com.nari.bdp.mine.base.utils.MineBaseUtil.convertDfToJavaList(MineBaseUtil.java:28)
  243. at com.nari.bdp.mine.operator.impl.AutoBehavior.getResuleTable(AutoBehavior.java:646)
  244. at com.nari.bdp.mine.operator.impl.AutoBehavior.getDatasetMeta(AutoBehavior.java:475)
  245. at com.nari.bdp.mine.operator.impl.AutoBehavior.generateDatasetInsight(AutoBehavior.java:499)
  246. at com.nari.bdp.mine.operator.impl.AutoBehavior.generatePortInsight(AutoBehavior.java:523)
  247. at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:612)
  248. at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
  249. at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635)
  250. at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
  251. at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635)
  252. at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
  253. at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635)
  254. at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
  255. at com.nari.bdp.mine.operator.impl.ExecutionImpl.executeNode(ExecutionImpl.java:75)
  256. at com.nari.bdp.mine.operator.impl.ProcessImpl.run(ProcessImpl.java:79)
  257. at com.nari.bdp.mine.executor.executor.Executor.rProcessExecute(Executor.java:125)
  258. at com.nari.bdp.mine.executor.executor.Executor.rProcessExecute(Executor.java:111)
  259. at com.nari.bdp.mine.executor.executor.Executor.excute(Executor.java:89)
  260. at com.nari.bdp.mine.executor.service.MineJobExecutor.jobExecutor(MineJobExecutor.java:75)
  261. at com.nari.bdp.mine_server_test_behavior.classification.DNNClassificationTest.main(DNNClassificationTest.java:34)
  262. Caused by: java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  263. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 0, date_time), StringType), true, false) AS date_time#335
  264. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 1, pollution), IntegerType) AS pollution#336
  265. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 2, dew), IntegerType) AS dew#337
  266. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 3, temp), IntegerType) AS temp#338
  267. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 4, press), IntegerType) AS press#339
  268. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 5, wnd_dir), StringType), true, false) AS wnd_dir#340
  269. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 6, wnd_spd), DoubleType) AS wnd_spd#341
  270. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 7, snow), IntegerType) AS snow#342
  271. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 8, rain), IntegerType) AS rain#343
  272. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 9, label_numeric), DoubleType) AS label_numeric#344
  273. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutput#345
  274. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS minMaxOutput#346
  275. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 12, predicition), DoubleType) AS predicition#347
  276. newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS probability#348
  277. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:291)
  278. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  279. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  280. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  281. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  282. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
  283. at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
  284. at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
  285. at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:216)
  286. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:295)
  287. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:266)
  288. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  289. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  290. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  291. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  292. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  293. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  294. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  295. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  296. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
  297. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
  298. at org.apache.spark.scheduler.Task.run(Task.scala:109)
  299. at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
  300. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  301. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  302. at java.lang.Thread.run(Thread.java:748)
  303. Caused by: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  304. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.Invoke2$(Unknown Source)
  305. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields3_5$(Unknown Source)
  306. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
  307. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:288)
  308. ... 25 more
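The root cause repeated throughout this trace is a clash between Spark's two vector APIs: the row values carry org.apache.spark.mllib.linalg.DenseVector, while the encoder schema expects the newer org.apache.spark.ml.linalg vector type (VectorUDT). A minimal, hypothetical sketch of how such a column could be converted before re-encoding follows; the class and method names are illustrative, and it assumes a Dataset<Row> (here called predictions) whose probability column still holds old mllib vectors.

import org.apache.spark.mllib.linalg.DenseVector;
import org.apache.spark.mllib.util.MLUtils;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

public class VectorTypeFixSketch {
    // Rewrites the named columns from mllib vectors to ml vectors so the row
    // values match a VectorUDT schema like the one in the encoder expressions above.
    public static Dataset<Row> toMlVectors(Dataset<Row> predictions) {
        return MLUtils.convertVectorColumnsToML(predictions, "probability");
    }

    // For a single value, the old mllib vector type exposes asML() for the same conversion.
    public static org.apache.spark.ml.linalg.Vector toMl(DenseVector oldVector) {
        return oldVector.asML();
    }
}

Where such a conversion would belong (inside the ClassificationdnnBehavior node or upstream of it) depends on where the mllib vector is produced, which this log does not show.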
  309. process: org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 28.0 failed 1 times, most recent failure: Lost task 3.0 in stage 28.0 (TID 79, localhost, executor driver): java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  310. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 0, date_time), StringType), true, false) AS date_time#335
  311. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 1, pollution), IntegerType) AS pollution#336
  312. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 2, dew), IntegerType) AS dew#337
  313. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 3, temp), IntegerType) AS temp#338
  314. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 4, press), IntegerType) AS press#339
  315. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 5, wnd_dir), StringType), true, false) AS wnd_dir#340
  316. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 6, wnd_spd), DoubleType) AS wnd_spd#341
  317. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 7, snow), IntegerType) AS snow#342
  318. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 8, rain), IntegerType) AS rain#343
  319. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 9, label_numeric), DoubleType) AS label_numeric#344
  320. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutput#345
  321. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS minMaxOutput#346
  322. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 12, predicition), DoubleType) AS predicition#347
  323. newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS probability#348
  324. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:291)
  325. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  326. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  327. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  328. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  329. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
  330. at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
  331. at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
  332. at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:216)
  333. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:295)
  334. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:266)
  335. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  336. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  337. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  338. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  339. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  340. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  341. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  342. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  343. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
  344. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
  345. at org.apache.spark.scheduler.Task.run(Task.scala:109)
  346. at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
  347. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  348. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  349. at java.lang.Thread.run(Thread.java:748)
  350. Caused by: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  351. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.Invoke2$(Unknown Source)
  352. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields3_5$(Unknown Source)
  353. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
  354. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:288)
  355. ... 25 more
  356.  
  357. Driver stacktrace:
  358. process: java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  359. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 0, date_time), StringType), true, false) AS date_time#335
  360. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 1, pollution), IntegerType) AS pollution#336
  361. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 2, dew), IntegerType) AS dew#337
  362. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 3, temp), IntegerType) AS temp#338
  363. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 4, press), IntegerType) AS press#339
  364. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 5, wnd_dir), StringType), true, false) AS wnd_dir#340
  365. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 6, wnd_spd), DoubleType) AS wnd_spd#341
  366. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 7, snow), IntegerType) AS snow#342
  367. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 8, rain), IntegerType) AS rain#343
  368. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 9, label_numeric), DoubleType) AS label_numeric#344
  369. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutput#345
  370. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS minMaxOutput#346
  371. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 12, predicition), DoubleType) AS predicition#347
  372. newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS probability#348
  373. process: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  374. Match Error: Job aborted due to stage failure: Task 3 in stage 28.0 failed 1 times, most recent failure: Lost task 3.0 in stage 28.0 (TID 79, localhost, executor driver): java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  375. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 0, date_time), StringType), true, false) AS date_time#335
  376. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 1, pollution), IntegerType) AS pollution#336
  377. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 2, dew), IntegerType) AS dew#337
  378. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 3, temp), IntegerType) AS temp#338
  379. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 4, press), IntegerType) AS press#339
  380. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 5, wnd_dir), StringType), true, false) AS wnd_dir#340
  381. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 6, wnd_spd), DoubleType) AS wnd_spd#341
  382. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 7, snow), IntegerType) AS snow#342
  383. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 8, rain), IntegerType) AS rain#343
  384. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 9, label_numeric), DoubleType) AS label_numeric#344
  385. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutput#345
  386. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS minMaxOutput#346
  387. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 12, predicition), DoubleType) AS predicition#347
  388. newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS probability#348
  389. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:291)
  390. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  391. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  392. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  393. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  394. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
  395. at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
  396. at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
  397. at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:216)
  398. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:295)
  399. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:266)
  400. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  401. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  402. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  403. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  404. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  405. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  406. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  407. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  408. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
  409. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
  410. at org.apache.spark.scheduler.Task.run(Task.scala:109)
  411. at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
  412. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  413. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  414. at java.lang.Thread.run(Thread.java:748)
  415. Caused by: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  416. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.Invoke2$(Unknown Source)
  417. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields3_5$(Unknown Source)
  418. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
  419. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:288)
  420. ... 25 more
  421.  
  422. Driver stacktrace:
  423. 2019-08-09 16:18:50,780 - com.nari.bdp.mine.operator.impl.ExecutionImpl -5 [main] ERROR - Job aborted due to stage failure: Task 3 in stage 28.0 failed 1 times, most recent failure: Lost task 3.0 in stage 28.0 (TID 79, localhost, executor driver): java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  424. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 0, date_time), StringType), true, false) AS date_time#335
  425. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 1, pollution), IntegerType) AS pollution#336
  426. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 2, dew), IntegerType) AS dew#337
  427. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 3, temp), IntegerType) AS temp#338
  428. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 4, press), IntegerType) AS press#339
  429. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 5, wnd_dir), StringType), true, false) AS wnd_dir#340
  430. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 6, wnd_spd), DoubleType) AS wnd_spd#341
  431. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 7, snow), IntegerType) AS snow#342
  432. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 8, rain), IntegerType) AS rain#343
  433. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 9, label_numeric), DoubleType) AS label_numeric#344
  434. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutput#345
  435. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS minMaxOutput#346
  436. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 12, predicition), DoubleType) AS predicition#347
  437. newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS probability#348
  438. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:291)
  439. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  440. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  441. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  442. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  443. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
  444. at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
  445. at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
  446. at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:216)
  447. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:295)
  448. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:266)
  449. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  450. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  451. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  452. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  453. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  454. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  455. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  456. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  457. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
  458. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
  459. at org.apache.spark.scheduler.Task.run(Task.scala:109)
  460. at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
  461. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  462. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  463. at java.lang.Thread.run(Thread.java:748)
  464. Caused by: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  465. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.Invoke2$(Unknown Source)
  466. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields3_5$(Unknown Source)
  467. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
  468. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:288)
  469. ... 25 more
  470.  
  471. Driver stacktrace:
  472. org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 28.0 failed 1 times, most recent failure: Lost task 3.0 in stage 28.0 (TID 79, localhost, executor driver): java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  473. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 0, date_time), StringType), true, false) AS date_time#335
  474. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 1, pollution), IntegerType) AS pollution#336
  475. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 2, dew), IntegerType) AS dew#337
  476. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 3, temp), IntegerType) AS temp#338
  477. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 4, press), IntegerType) AS press#339
  478. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 5, wnd_dir), StringType), true, false) AS wnd_dir#340
  479. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 6, wnd_spd), DoubleType) AS wnd_spd#341
  480. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 7, snow), IntegerType) AS snow#342
  481. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 8, rain), IntegerType) AS rain#343
  482. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 9, label_numeric), DoubleType) AS label_numeric#344
  483. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutput#345
  484. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS minMaxOutput#346
  485. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 12, predicition), DoubleType) AS predicition#347
  486. newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS probability#348
  487. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:291)
  488. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  489. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  490. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  491. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  492. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
  493. at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
  494. at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
  495. at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:216)
  496. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:295)
  497. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:266)
  498. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  499. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  500. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  501. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  502. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  503. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  504. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  505. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  506. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
  507. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
  508. at org.apache.spark.scheduler.Task.run(Task.scala:109)
  509. at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
  510. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  511. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  512. at java.lang.Thread.run(Thread.java:748)
  513. Caused by: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  514. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.Invoke2$(Unknown Source)
  515. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields3_5$(Unknown Source)
  516. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
  517. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:288)
  518. ... 25 more
  519.  
  520. Driver stacktrace:
  521. at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1599)
  522. at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1587)
  523. at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1586)
  524. at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  525. at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
  526. at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1586)
  527. at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
  528. at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831)
  529. at scala.Option.foreach(Option.scala:257)
  530. at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:831)
  531. at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1820)
  532. at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1769)
  533. at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1758)
  534. at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
  535. at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:642)
  536. at org.apache.spark.SparkContext.runJob(SparkContext.scala:2027)
  537. at org.apache.spark.SparkContext.runJob(SparkContext.scala:2048)
  538. at org.apache.spark.SparkContext.runJob(SparkContext.scala:2067)
  539. at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:363)
  540. at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:38)
  541. at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3272)
  542. at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
  543. at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
  544. at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3253)
  545. at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
  546. at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3252)
  547. at org.apache.spark.sql.Dataset.head(Dataset.scala:2484)
  548. at org.apache.spark.sql.Dataset.take(Dataset.scala:2698)
  549. at com.nari.bdp.mine.base.utils.MineBaseUtil.convertDfToJavaListDColumns(MineBaseUtil.java:76)
  550. at com.nari.bdp.mine.base.utils.MineBaseUtil.convertDfToJavaList(MineBaseUtil.java:28)
  551. at com.nari.bdp.mine.operator.impl.AutoBehavior.getResuleTable(AutoBehavior.java:646)
  552. at com.nari.bdp.mine.operator.impl.AutoBehavior.getDatasetMeta(AutoBehavior.java:475)
  553. at com.nari.bdp.mine.operator.impl.AutoBehavior.generateDatasetInsight(AutoBehavior.java:499)
  554. at com.nari.bdp.mine.operator.impl.AutoBehavior.generatePortInsight(AutoBehavior.java:523)
  555. at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:612)
  556. at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
  557. at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635)
  558. at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
  559. at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635)
  560. at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
  561. at com.nari.bdp.mine.operator.impl.AutoBehavior.leave(AutoBehavior.java:635)
  562. at com.nari.bdp.mine.operator.impl.AutoBehavior.executeBehavior(AutoBehavior.java:437)
  563. at com.nari.bdp.mine.operator.impl.ExecutionImpl.executeNode(ExecutionImpl.java:75)
  564. at com.nari.bdp.mine.operator.impl.ProcessImpl.run(ProcessImpl.java:79)
  565. at com.nari.bdp.mine.executor.executor.Executor.rProcessExecute(Executor.java:125)
  566. at com.nari.bdp.mine.executor.executor.Executor.rProcessExecute(Executor.java:111)
  567. at com.nari.bdp.mine.executor.executor.Executor.excute(Executor.java:89)
  568. at com.nari.bdp.mine.executor.service.MineJobExecutor.jobExecutor(MineJobExecutor.java:75)
  569. at com.nari.bdp.mine_server_test_behavior.classification.DNNClassificationTest.main(DNNClassificationTest.java:34)
  570. Caused by: java.lang.RuntimeException: Error while encoding: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  571. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 0, date_time), StringType), true, false) AS date_time#335
  572. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 1, pollution), IntegerType) AS pollution#336
  573. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 2, dew), IntegerType) AS dew#337
  574. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 3, temp), IntegerType) AS temp#338
  575. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 4, press), IntegerType) AS press#339
  576. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else staticinvoke(class org.apache.spark.unsafe.types.UTF8String, StringType, fromString, validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 5, wnd_dir), StringType), true, false) AS wnd_dir#340
  577. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 6, wnd_spd), DoubleType) AS wnd_spd#341
  578. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 7, snow), IntegerType) AS snow#342
  579. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 8, rain), IntegerType) AS rain#343
  580. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 9, label_numeric), DoubleType) AS label_numeric#344
  581. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS featureoutput#345
  582. if (assertnotnull(input[0, org.apache.spark.sql.Row, true]).isNullAt) null else newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS minMaxOutput#346
  583. validateexternaltype(getexternalrowfield(assertnotnull(input[0, org.apache.spark.sql.Row, true]), 12, predicition), DoubleType) AS predicition#347
  584. newInstance(class org.apache.spark.ml.linalg.VectorUDT).serialize AS probability#348
  585. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:291)
  586. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  587. at org.apache.spark.sql.SparkSession$$anonfun$4.apply(SparkSession.scala:589)
  588. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  589. at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
  590. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
  591. at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
  592. at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
  593. at org.apache.spark.sql.execution.UnsafeExternalRowSorter.sort(UnsafeExternalRowSorter.java:216)
  594. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:295)
  595. at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$2.apply(ShuffleExchangeExec.scala:266)
  596. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  597. at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:830)
  598. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  599. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  600. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  601. at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
  602. at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
  603. at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
  604. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
  605. at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
  606. at org.apache.spark.scheduler.Task.run(Task.scala:109)
  607. at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
  608. at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  609. at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  610. at java.lang.Thread.run(Thread.java:748)
  611. Caused by: java.lang.RuntimeException: org.apache.spark.mllib.linalg.DenseVector is not a valid external type for schema of vector
  612. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.Invoke2$(Unknown Source)
  613. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.writeFields3_5$(Unknown Source)
  614. at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.apply(Unknown Source)
  615. at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.toRow(ExpressionEncoder.scala:288)
  616. ... 25 more
  617. com.meritdata.tempo.force.exit is:null
  618. Disconnected from the target VM, address: '127.0.0.1:52686', transport: 'socket'
  619.  
  620. Process finished with exit code 1024
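
Editor's note on the failure above: every "Caused by" in this trace reports the same Spark type mismatch. The DataFrame schema declares the vector columns seen in the encoder dump (featureoutput#345, minMaxOutput#346, probability#348) with the new DataFrame-API vector type (org.apache.spark.ml.linalg.VectorUDT), while the row objects actually hold old RDD-API org.apache.spark.mllib.linalg.DenseVector instances, so ExpressionEncoder.toRow rejects them when the job is finally evaluated. A minimal repair sketch follows, assuming Spark 2.x; the class name VectorCompat and the method toMl are placeholders, not identifiers from the failing job, and only the column behavior described comes from the trace.

    import org.apache.spark.ml.linalg.Vector;

    public final class VectorCompat {
        // Converts an old-API (mllib) vector, which is what the rows actually
        // contain, into the new-API (ml) vector type that the declared schema
        // expects. Apply this to each vector value before the rows are handed
        // to SparkSession.createDataFrame, or alternatively declare the schema
        // with the matching org.apache.spark.mllib.linalg.VectorUDT instead.
        public static Vector toMl(org.apache.spark.mllib.linalg.Vector v) {
            return v == null ? null : v.asML();
        }
    }

For DataFrames whose schema already uses the old mllib vector type, Spark also provides org.apache.spark.mllib.util.MLUtils.convertVectorColumnsToML as a column-level conversion; either way, the row values and the declared schema must agree before the Dataset.take call in MineBaseUtil.convertDfToJavaListDColumns forces evaluation.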