Spark job fails with the error log below:

[2018-04-19T06:59:58.528Z] oracle.kv.spark.rdd.KVStoreRDD DEBUG: Generated SQL statement: SELECT stringValue,entityId,numericValue,id,booleanValue,paramsHash FROM fm_metrics_0 WHERE (eventTime = 0) AND (paramsHash = 'a2770969-c827-30f2-910f-6179418462df') AND ((id = 'e2668020-3b1f-3b61-8e9b-23d43fa65234') OR (id = '171bd431-7566-3414-ad41-315c46507fbf')) for columns [stringValue,entityId,numericValue,id,booleanValue,paramsHash] and filters [EqualTo(eventTime,0),EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df),Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]
[2018-04-19T06:59:58.905Z] oracle.kv.spark.rdd.KVStoreRDD DEBUG: Generated SQL statement: SELECT stringValue,entityId,numericValue,id,booleanValue,paramsHash FROM fm_metrics_0 WHERE (eventTime = 0) AND (paramsHash = 'a2770969-c827-30f2-910f-6179418462df') AND ((id = 'e2668020-3b1f-3b61-8e9b-23d43fa65234') OR (id = '171bd431-7566-3414-ad41-315c46507fbf')) for columns [stringValue,entityId,numericValue,id,booleanValue,paramsHash] and filters [EqualTo(eventTime,0),EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df),Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]
org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
TungstenAggregate(key=[], functions=[(count(1),mode=Final,isDistinct=false)], output=[_c0#278,result#275L])
+- TungstenExchange SinglePartition, None
   +- TungstenAggregate(key=[], functions=[(count(1),mode=Partial,isDistinct=false)], output=[count#286L])
      +- Project
         +- SortMergeJoin [entityId#95], [ID#210]
            :- Sort [entityId#95 ASC], false, 0
            :  +- Project [entityId#95]
            :     +- Filter ((entityMetrics#274[7c2aabcc-474c-36d5-a684-09db26387f3d].numericValue < 8.502857022092718) && (entityMetrics#274[77f273b2-cd5b-3e00-98fe-f8c97cfcd85e].booleanValue = true))
            :        +- SortBasedAggregate(key=[entityId#95], functions=[(MapMetricResults(id#92,paramsHash#93,numericValue#102,booleanValue#103,stringValue#104),mode=Final,isDistinct=false)], output=[entityId#95,entityMetrics#274])
            :           +- ConvertToSafe
            :              +- Sort [entityId#95 ASC], false, 0
            :                 +- TungstenExchange hashpartitioning(entityId#95,200), None
            :                    +- ConvertToUnsafe
            :                       +- SortBasedAggregate(key=[entityId#95], functions=[(MapMetricResults(id#92,paramsHash#93,numericValue#102,booleanValue#103,stringValue#104),mode=Partial,isDistinct=false)], output=[entityId#95,map#277])
            :                          +- ConvertToSafe
            :                             +- Sort [entityId#95 ASC], false, 0
            :                                +- Project [stringValue#104,entityId#95,numericValue#102,id#92,booleanValue#103,paramsHash#93]
            :                                   +- MetricsSparkPlan FM_CommonOutput_NoSql_Link|NoSqlDataLink|R
            :                                      +- Scan oracle.kv.spark.KVStoreAvroRelation@5c1bf28a[stringValue#104,entityId#95,numericValue#102,id#92,booleanValue#103,paramsHash#93] PushedFilters: [EqualTo(eventTime,0), EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df), Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]
            +- Sort [ID#210 ASC], false, 0
               +- TungstenExchange hashpartitioning(ID#210,200), None
                  +- Project [ID#210]
                     +- Filter (STATUS#222 = STOPPED)
                        +- MetricsSparkPlan CommonDatabaseLink|DbaasDataLink|R
                           +- Scan JDBCRelation(jdbc:oracle:thin:iot/oracle@//database:1521/orcl,(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE),[Lorg.apache.spark.Partition;@e2f9484,{url=jdbc:oracle:thin:iot/oracle@//database:1521/orcl, dbtable=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE), driver=oracle.jdbc.driver.OracleDriver, table-name=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE)})[ID#210,STATUS#222] PushedFilters: [EqualTo(STATUS,STOPPED)]

    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate.doExecute(TungstenAggregate.scala:80)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.ConvertToSafe.doExecute(rowFormatConverters.scala:56)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:187)
    at org.apache.spark.sql.execution.Limit.executeCollect(basicOperators.scala:165)
    at org.apache.spark.sql.execution.SparkPlan.executeCollectPublic(SparkPlan.scala:174)
    at org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$execute$1$1.apply(DataFrame.scala:1499)
    at org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$execute$1$1.apply(DataFrame.scala:1499)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:56)
    at org.apache.spark.sql.DataFrame.withNewExecutionId(DataFrame.scala:2086)
    at org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$execute$1(DataFrame.scala:1498)
    at org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$collect(DataFrame.scala:1505)
    at org.apache.spark.sql.DataFrame$$anonfun$head$1.apply(DataFrame.scala:1375)
    at org.apache.spark.sql.DataFrame$$anonfun$head$1.apply(DataFrame.scala:1374)
    at org.apache.spark.sql.DataFrame.withCallback(DataFrame.scala:2099)
    at org.apache.spark.sql.DataFrame.head(DataFrame.scala:1374)
    at org.apache.spark.sql.DataFrame.take(DataFrame.scala:1456)
    at org.apache.spark.sql.DataFrame.showString(DataFrame.scala:170)
    at org.apache.spark.sql.DataFrame.show(DataFrame.scala:350)
    at org.apache.spark.sql.DataFrame.show(DataFrame.scala:311)
    at org.apache.spark.sql.DataFrame.show(DataFrame.scala:319)
    at com.oracle.iot.apps.common.spark.utils.DynamicProcessorSparkUtils.doCompute(DynamicProcessorSparkUtils.java:90)
    at com.oracle.iot.fm.processor.CAS_DynamicProcessor.compute(CAS_DynamicProcessor.java:184)
    at com.oracle.iot.fm.processor.CAS_DynamicProcessor.execute(CAS_DynamicProcessor.java:142)
    at com.oracle.iot.fm.processor.CAS_DynamicProcessor.execute(CAS_DynamicProcessor.java:46)
    at com.oracle.bacs.bootstrap.application.txflow.TxFlowManagerImpl$BatchTxFlowHandler.lambda$execute$0(TxFlowManagerImpl.java:517)
    at com.oracle.bacs.bootstrap.scope.AnalyticsProcessorScope.doWithAnalyticsProcessor(AnalyticsProcessorScope.java:89)
    at com.oracle.bacs.bootstrap.application.txflow.TxFlowManagerImpl$BatchTxFlowHandler.execute(TxFlowManagerImpl.java:515)
    at com.oracle.bacs.bootstrap.application.txflow.TxFlowManagerImpl.execute(TxFlowManagerImpl.java:239)
    at com.oracle.bacs.bootstrap.application.rest.BatchTxFlowExecutor.lambda$submitInternal$0(BatchTxFlowExecutor.java:284)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
TungstenExchange SinglePartition, None
+- TungstenAggregate(key=[], functions=[(count(1),mode=Partial,isDistinct=false)], output=[count#286L])
   +- Project
      +- SortMergeJoin [entityId#95], [ID#210]
         :- Sort [entityId#95 ASC], false, 0
         :  +- Project [entityId#95]
         :     +- Filter ((entityMetrics#274[7c2aabcc-474c-36d5-a684-09db26387f3d].numericValue < 8.502857022092718) && (entityMetrics#274[77f273b2-cd5b-3e00-98fe-f8c97cfcd85e].booleanValue = true))
         :        +- SortBasedAggregate(key=[entityId#95], functions=[(MapMetricResults(id#92,paramsHash#93,numericValue#102,booleanValue#103,stringValue#104),mode=Final,isDistinct=false)], output=[entityId#95,entityMetrics#274])
         :           +- ConvertToSafe
         :              +- Sort [entityId#95 ASC], false, 0
         :                 +- TungstenExchange hashpartitioning(entityId#95,200), None
         :                    +- ConvertToUnsafe
         :                       +- SortBasedAggregate(key=[entityId#95], functions=[(MapMetricResults(id#92,paramsHash#93,numericValue#102,booleanValue#103,stringValue#104),mode=Partial,isDistinct=false)], output=[entityId#95,map#277])
         :                          +- ConvertToSafe
         :                             +- Sort [entityId#95 ASC], false, 0
         :                                +- Project [stringValue#104,entityId#95,numericValue#102,id#92,booleanValue#103,paramsHash#93]
         :                                   +- MetricsSparkPlan FM_CommonOutput_NoSql_Link|NoSqlDataLink|R
         :                                      +- Scan oracle.kv.spark.KVStoreAvroRelation@5c1bf28a[stringValue#104,entityId#95,numericValue#102,id#92,booleanValue#103,paramsHash#93] PushedFilters: [EqualTo(eventTime,0), EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df), Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]
         +- Sort [ID#210 ASC], false, 0
            +- TungstenExchange hashpartitioning(ID#210,200), None
               +- Project [ID#210]
                  +- Filter (STATUS#222 = STOPPED)
                     +- MetricsSparkPlan CommonDatabaseLink|DbaasDataLink|R
                        +- Scan JDBCRelation(jdbc:oracle:thin:iot/oracle@//database:1521/orcl,(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE),[Lorg.apache.spark.Partition;@e2f9484,{url=jdbc:oracle:thin:iot/oracle@//database:1521/orcl, dbtable=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE), driver=oracle.jdbc.driver.OracleDriver, table-name=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE)})[ID#210,STATUS#222] PushedFilters: [EqualTo(STATUS,STOPPED)]

    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
    at org.apache.spark.sql.execution.Exchange.doExecute(Exchange.scala:247)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate$$anonfun$doExecute$1.apply(TungstenAggregate.scala:86)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate$$anonfun$doExecute$1.apply(TungstenAggregate.scala:80)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
    ... 41 more
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
TungstenAggregate(key=[], functions=[(count(1),mode=Partial,isDistinct=false)], output=[count#286L])
+- Project
   +- SortMergeJoin [entityId#95], [ID#210]
      :- Sort [entityId#95 ASC], false, 0
      :  +- Project [entityId#95]
      :     +- Filter ((entityMetrics#274[7c2aabcc-474c-36d5-a684-09db26387f3d].numericValue < 8.502857022092718) && (entityMetrics#274[77f273b2-cd5b-3e00-98fe-f8c97cfcd85e].booleanValue = true))
      :        +- SortBasedAggregate(key=[entityId#95], functions=[(MapMetricResults(id#92,paramsHash#93,numericValue#102,booleanValue#103,stringValue#104),mode=Final,isDistinct=false)], output=[entityId#95,entityMetrics#274])
      :           +- ConvertToSafe
      :              +- Sort [entityId#95 ASC], false, 0
      :                 +- TungstenExchange hashpartitioning(entityId#95,200), None
      :                    +- ConvertToUnsafe
      :                       +- SortBasedAggregate(key=[entityId#95], functions=[(MapMetricResults(id#92,paramsHash#93,numericValue#102,booleanValue#103,stringValue#104),mode=Partial,isDistinct=false)], output=[entityId#95,map#277])
      :                          +- ConvertToSafe
      :                             +- Sort [entityId#95 ASC], false, 0
      :                                +- Project [stringValue#104,entityId#95,numericValue#102,id#92,booleanValue#103,paramsHash#93]
      :                                   +- MetricsSparkPlan FM_CommonOutput_NoSql_Link|NoSqlDataLink|R
      :                                      +- Scan oracle.kv.spark.KVStoreAvroRelation@5c1bf28a[stringValue#104,entityId#95,numericValue#102,id#92,booleanValue#103,paramsHash#93] PushedFilters: [EqualTo(eventTime,0), EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df), Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]
      +- Sort [ID#210 ASC], false, 0
         +- TungstenExchange hashpartitioning(ID#210,200), None
            +- Project [ID#210]
               +- Filter (STATUS#222 = STOPPED)
                  +- MetricsSparkPlan CommonDatabaseLink|DbaasDataLink|R
                     +- Scan JDBCRelation(jdbc:oracle:thin:iot/oracle@//database:1521/orcl,(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE),[Lorg.apache.spark.Partition;@e2f9484,{url=jdbc:oracle:thin:iot/oracle@//database:1521/orcl, dbtable=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE), driver=oracle.jdbc.driver.OracleDriver, table-name=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE)})[ID#210,STATUS#222] PushedFilters: [EqualTo(STATUS,STOPPED)]

    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate.doExecute(TungstenAggregate.scala:80)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Exchange.prepareShuffleDependency(Exchange.scala:164)
    at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:254)
    at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:248)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
    ... 49 more
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
SortBasedAggregate(key=[entityId#95], functions=[(MapMetricResults(id#92,paramsHash#93,numericValue#102,booleanValue#103,stringValue#104),mode=Final,isDistinct=false)], output=[entityId#95,entityMetrics#274])
+- ConvertToSafe
   +- Sort [entityId#95 ASC], false, 0
      +- TungstenExchange hashpartitioning(entityId#95,200), None
         +- ConvertToUnsafe
            +- SortBasedAggregate(key=[entityId#95], functions=[(MapMetricResults(id#92,paramsHash#93,numericValue#102,booleanValue#103,stringValue#104),mode=Partial,isDistinct=false)], output=[entityId#95,map#277])
               +- ConvertToSafe
                  +- Sort [entityId#95 ASC], false, 0
                     +- Project [stringValue#104,entityId#95,numericValue#102,id#92,booleanValue#103,paramsHash#93]
                        +- MetricsSparkPlan FM_CommonOutput_NoSql_Link|NoSqlDataLink|R
                           +- Scan oracle.kv.spark.KVStoreAvroRelation@5c1bf28a[stringValue#104,entityId#95,numericValue#102,id#92,booleanValue#103,paramsHash#93] PushedFilters: [EqualTo(eventTime,0), EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df), Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]

    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
    at org.apache.spark.sql.execution.aggregate.SortBasedAggregate.doExecute(SortBasedAggregate.scala:69)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Filter.doExecute(basicOperators.scala:70)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Project.doExecute(basicOperators.scala:46)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Sort.doExecute(Sort.scala:64)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.joins.SortMergeJoin.doExecute(SortMergeJoin.scala:70)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Project.doExecute(basicOperators.scala:46)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate$$anonfun$doExecute$1.apply(TungstenAggregate.scala:86)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate$$anonfun$doExecute$1.apply(TungstenAggregate.scala:80)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
    ... 58 more
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
TungstenExchange hashpartitioning(entityId#95,200), None
+- ConvertToUnsafe
   +- SortBasedAggregate(key=[entityId#95], functions=[(MapMetricResults(id#92,paramsHash#93,numericValue#102,booleanValue#103,stringValue#104),mode=Partial,isDistinct=false)], output=[entityId#95,map#277])
      +- ConvertToSafe
         +- Sort [entityId#95 ASC], false, 0
            +- Project [stringValue#104,entityId#95,numericValue#102,id#92,booleanValue#103,paramsHash#93]
               +- MetricsSparkPlan FM_CommonOutput_NoSql_Link|NoSqlDataLink|R
                  +- Scan oracle.kv.spark.KVStoreAvroRelation@5c1bf28a[stringValue#104,entityId#95,numericValue#102,id#92,booleanValue#103,paramsHash#93] PushedFilters: [EqualTo(eventTime,0), EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df), Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]

    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
    at org.apache.spark.sql.execution.Exchange.doExecute(Exchange.scala:247)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Sort.doExecute(Sort.scala:64)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.ConvertToSafe.doExecute(rowFormatConverters.scala:56)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.aggregate.SortBasedAggregate$$anonfun$doExecute$1.apply(SortBasedAggregate.scala:72)
    at org.apache.spark.sql.execution.aggregate.SortBasedAggregate$$anonfun$doExecute$1.apply(SortBasedAggregate.scala:69)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
    ... 91 more
Caused by: java.lang.IllegalArgumentException: Invalid KVStoreConfig. Request timeout: 45,000 ms exceeds socket read timeout: 30,000 ms
    at oracle.kv.KVStoreFactory.getStoreInternal(KVStoreFactory.java:159)
    at oracle.kv.KVStoreFactory.getStore(KVStoreFactory.java:122)
    at oracle.kv.KVStoreFactory.getStore(KVStoreFactory.java:59)
    at oracle.kv.spark.utils.ConnectionDef$$anonfun$createConnection$4.apply(ConnectionDef.scala:40)
    at oracle.kv.spark.utils.ConnectionDef$$anonfun$createConnection$4.apply(ConnectionDef.scala:40)
    at scala.util.Try$.apply(Try.scala:161)
    at oracle.kv.spark.utils.ConnectionDef.createConnection(ConnectionDef.scala:40)
    at oracle.kv.spark.rdd.KVStoreRDD.computePartitions(KVStoreRDD.scala:155)
    at oracle.kv.spark.rdd.KVStoreRDD.getPartitions(KVStoreRDD.scala:152)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.bacs.metrics.MetricsRDD.getPartitions(MetricsRDD.scala:33)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:91)
    at org.apache.spark.sql.execution.Exchange.prepareShuffleDependency(Exchange.scala:220)
    at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:254)
    at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:248)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
    ... 109 more
[2018-04-19T07:00:00.103Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.x_dbaasx969325827_rvxf.user" from Hadoop...
[2018-04-19T07:00:00.104Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.x_dbaasx969325827_rvxf.password" from Hadoop...
[2018-04-19T07:00:00.155Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.x_dbaasx1950214322_nsx2.user" from Hadoop...
[2018-04-19T07:00:00.156Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.x_dbaasx1950214322_nsx2.password" from Hadoop...
[2018-04-19T07:00:00.208Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.x_dbaasx1646181608_4f3n.user" from Hadoop...
[2018-04-19T07:00:00.208Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.x_dbaasx1646181608_4f3n.password" from Hadoop...
[2018-04-19T07:00:00.273Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.x_dbaasx958504679_sxio.user" from Hadoop...
[2018-04-19T07:00:00.273Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.x_dbaasx958504679_sxio.password" from Hadoop...
[2018-04-19T07:00:00.342Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.xdbaasx1116648067_l74a.user" from Hadoop...
[2018-04-19T07:00:00.342Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.xdbaasx1116648067_l74a.password" from Hadoop...
[2018-04-19T07:00:00.383Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.x_dbaasx2077748762_ak2z.user" from Hadoop...
[2018-04-19T07:00:00.383Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.x_dbaasx2077748762_ak2z.password" from Hadoop...
[2018-04-19T07:00:00.416Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.commondatabaselink_5s5x.user" from Hadoop...
[2018-04-19T07:00:00.417Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.commondatabaselink_5s5x.password" from Hadoop...
[2018-04-19T07:00:00.483Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.commondatabaselink_5s5x.user" from Hadoop...
[2018-04-19T07:00:00.483Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.commondatabaselink_5s5x.password" from Hadoop...
[2018-04-19T07:00:00.546Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.commondatabaselink_5s5x.user" from Hadoop...
[2018-04-19T07:00:00.546Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.commondatabaselink_5s5x.password" from Hadoop...
[2018-04-19T07:00:00.594Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.commondatabaselink_5s5x.user" from Hadoop...
[2018-04-19T07:00:00.594Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.cred.HadoopCredentialService, appId: 0-AD, du: 1, message: Retrieving credential "oracle.iot.tenant.0-ad_0-ad.commondatabaselink_5s5x.password" from Hadoop...
[2018-04-19T07:00:00.861Z] oracle.kv.spark.rdd.KVStoreRDD DEBUG: Generated SQL statement: SELECT entityId,stringValue,numericValue,booleanValue,id,paramsHash FROM fm_metrics_0 WHERE (eventTime = 0) AND (paramsHash = 'a2770969-c827-30f2-910f-6179418462df') AND ((id = 'e2668020-3b1f-3b61-8e9b-23d43fa65234') OR (id = '171bd431-7566-3414-ad41-315c46507fbf')) for columns [entityId,stringValue,numericValue,booleanValue,id,paramsHash] and filters [EqualTo(eventTime,0),EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df),Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]
org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
TungstenAggregate(key=[], functions=[(count(1),mode=Final,isDistinct=false)], output=[_c0#551,result#548L])
+- TungstenExchange SinglePartition, None
   +- TungstenAggregate(key=[], functions=[(count(1),mode=Partial,isDistinct=false)], output=[count#555L])
      +- Project
         +- SortMergeJoin [entityId#368], [ID#483]
            :- Sort [entityId#368 ASC], false, 0
            :  +- Project [entityId#368]
            :     +- Filter ((entityMetrics#547[7c2aabcc-474c-36d5-a684-09db26387f3d].numericValue < 8.502857022092718) && (entityMetrics#547[77f273b2-cd5b-3e00-98fe-f8c97cfcd85e].booleanValue = true))
            :        +- SortBasedAggregate(key=[entityId#368], functions=[(MapMetricResults(id#365,paramsHash#366,numericValue#375,booleanValue#376,stringValue#377),mode=Final,isDistinct=false)], output=[entityId#368,entityMetrics#547])
            :           +- ConvertToSafe
            :              +- Sort [entityId#368 ASC], false, 0
            :                 +- TungstenExchange hashpartitioning(entityId#368,200), None
            :                    +- ConvertToUnsafe
            :                       +- SortBasedAggregate(key=[entityId#368], functions=[(MapMetricResults(id#365,paramsHash#366,numericValue#375,booleanValue#376,stringValue#377),mode=Partial,isDistinct=false)], output=[entityId#368,map#550])
            :                          +- ConvertToSafe
            :                             +- Sort [entityId#368 ASC], false, 0
            :                                +- Project [entityId#368,stringValue#377,numericValue#375,booleanValue#376,id#365,paramsHash#366]
            :                                   +- MetricsSparkPlan FM_CommonOutput_NoSql_Link|NoSqlDataLink|R
            :                                      +- Scan oracle.kv.spark.KVStoreAvroRelation@453842ac[entityId#368,stringValue#377,numericValue#375,booleanValue#376,id#365,paramsHash#366] PushedFilters: [EqualTo(eventTime,0), EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df), Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]
            +- Sort [ID#483 ASC], false, 0
               +- TungstenExchange hashpartitioning(ID#483,200), None
                  +- Project [ID#483]
                     +- Filter (STATUS#495 = STOPPED)
                        +- MetricsSparkPlan CommonDatabaseLink|DbaasDataLink|R
                           +- Scan JDBCRelation(jdbc:oracle:thin:iot/oracle@//database:1521/orcl,(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE),[Lorg.apache.spark.Partition;@3f2ca11a,{url=jdbc:oracle:thin:iot/oracle@//database:1521/orcl, dbtable=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE), driver=oracle.jdbc.driver.OracleDriver, table-name=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE)})[ID#483,STATUS#495] PushedFilters: [EqualTo(STATUS,STOPPED)]

    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate.doExecute(TungstenAggregate.scala:80)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:55)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:55)
    at org.apache.spark.sql.DataFrame.rdd$lzycompute(DataFrame.scala:1637)
    at org.apache.spark.sql.DataFrame.rdd(DataFrame.scala:1634)
    at org.apache.spark.sql.DataFrame.toJavaRDD(DataFrame.scala:1648)
    at com.oracle.iot.apps.common.spark.utils.DynamicProcessorSparkUtils.saveDQResultsToBOVS(DynamicProcessorSparkUtils.java:134)
    at com.oracle.iot.fm.processor.CAS_DynamicProcessor.execute(CAS_DynamicProcessor.java:143)
    at com.oracle.iot.fm.processor.CAS_DynamicProcessor.execute(CAS_DynamicProcessor.java:46)
    at com.oracle.bacs.bootstrap.application.txflow.TxFlowManagerImpl$BatchTxFlowHandler.lambda$execute$0(TxFlowManagerImpl.java:517)
    at com.oracle.bacs.bootstrap.scope.AnalyticsProcessorScope.doWithAnalyticsProcessor(AnalyticsProcessorScope.java:89)
    at com.oracle.bacs.bootstrap.application.txflow.TxFlowManagerImpl$BatchTxFlowHandler.execute(TxFlowManagerImpl.java:515)
    at com.oracle.bacs.bootstrap.application.txflow.TxFlowManagerImpl.execute(TxFlowManagerImpl.java:239)
    at com.oracle.bacs.bootstrap.application.rest.BatchTxFlowExecutor.lambda$submitInternal$0(BatchTxFlowExecutor.java:284)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
TungstenExchange SinglePartition, None
+- TungstenAggregate(key=[], functions=[(count(1),mode=Partial,isDistinct=false)], output=[count#555L])
   +- Project
      +- SortMergeJoin [entityId#368], [ID#483]
         :- Sort [entityId#368 ASC], false, 0
         :  +- Project [entityId#368]
         :     +- Filter ((entityMetrics#547[7c2aabcc-474c-36d5-a684-09db26387f3d].numericValue < 8.502857022092718) && (entityMetrics#547[77f273b2-cd5b-3e00-98fe-f8c97cfcd85e].booleanValue = true))
         :        +- SortBasedAggregate(key=[entityId#368], functions=[(MapMetricResults(id#365,paramsHash#366,numericValue#375,booleanValue#376,stringValue#377),mode=Final,isDistinct=false)], output=[entityId#368,entityMetrics#547])
         :           +- ConvertToSafe
         :              +- Sort [entityId#368 ASC], false, 0
         :                 +- TungstenExchange hashpartitioning(entityId#368,200), None
         :                    +- ConvertToUnsafe
         :                       +- SortBasedAggregate(key=[entityId#368], functions=[(MapMetricResults(id#365,paramsHash#366,numericValue#375,booleanValue#376,stringValue#377),mode=Partial,isDistinct=false)], output=[entityId#368,map#550])
         :                          +- ConvertToSafe
         :                             +- Sort [entityId#368 ASC], false, 0
         :                                +- Project [entityId#368,stringValue#377,numericValue#375,booleanValue#376,id#365,paramsHash#366]
         :                                   +- MetricsSparkPlan FM_CommonOutput_NoSql_Link|NoSqlDataLink|R
         :                                      +- Scan oracle.kv.spark.KVStoreAvroRelation@453842ac[entityId#368,stringValue#377,numericValue#375,booleanValue#376,id#365,paramsHash#366] PushedFilters: [EqualTo(eventTime,0), EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df), Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]
         +- Sort [ID#483 ASC], false, 0
            +- TungstenExchange hashpartitioning(ID#483,200), None
               +- Project [ID#483]
                  +- Filter (STATUS#495 = STOPPED)
                     +- MetricsSparkPlan CommonDatabaseLink|DbaasDataLink|R
                        +- Scan JDBCRelation(jdbc:oracle:thin:iot/oracle@//database:1521/orcl,(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE),[Lorg.apache.spark.Partition;@3f2ca11a,{url=jdbc:oracle:thin:iot/oracle@//database:1521/orcl, dbtable=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE), driver=oracle.jdbc.driver.OracleDriver, table-name=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE)})[ID#483,STATUS#495] PushedFilters: [EqualTo(STATUS,STOPPED)]

    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
    at org.apache.spark.sql.execution.Exchange.doExecute(Exchange.scala:247)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate$$anonfun$doExecute$1.apply(TungstenAggregate.scala:86)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate$$anonfun$doExecute$1.apply(TungstenAggregate.scala:80)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
    ... 22 more
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
TungstenAggregate(key=[], functions=[(count(1),mode=Partial,isDistinct=false)], output=[count#555L])
+- Project
   +- SortMergeJoin [entityId#368], [ID#483]
      :- Sort [entityId#368 ASC], false, 0
      :  +- Project [entityId#368]
      :     +- Filter ((entityMetrics#547[7c2aabcc-474c-36d5-a684-09db26387f3d].numericValue < 8.502857022092718) && (entityMetrics#547[77f273b2-cd5b-3e00-98fe-f8c97cfcd85e].booleanValue = true))
      :        +- SortBasedAggregate(key=[entityId#368], functions=[(MapMetricResults(id#365,paramsHash#366,numericValue#375,booleanValue#376,stringValue#377),mode=Final,isDistinct=false)], output=[entityId#368,entityMetrics#547])
      :           +- ConvertToSafe
      :              +- Sort [entityId#368 ASC], false, 0
      :                 +- TungstenExchange hashpartitioning(entityId#368,200), None
      :                    +- ConvertToUnsafe
      :                       +- SortBasedAggregate(key=[entityId#368], functions=[(MapMetricResults(id#365,paramsHash#366,numericValue#375,booleanValue#376,stringValue#377),mode=Partial,isDistinct=false)], output=[entityId#368,map#550])
      :                          +- ConvertToSafe
      :                             +- Sort [entityId#368 ASC], false, 0
      :                                +- Project [entityId#368,stringValue#377,numericValue#375,booleanValue#376,id#365,paramsHash#366]
      :                                   +- MetricsSparkPlan FM_CommonOutput_NoSql_Link|NoSqlDataLink|R
      :                                      +- Scan oracle.kv.spark.KVStoreAvroRelation@453842ac[entityId#368,stringValue#377,numericValue#375,booleanValue#376,id#365,paramsHash#366] PushedFilters: [EqualTo(eventTime,0), EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df), Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]
      +- Sort [ID#483 ASC], false, 0
         +- TungstenExchange hashpartitioning(ID#483,200), None
            +- Project [ID#483]
               +- Filter (STATUS#495 = STOPPED)
                  +- MetricsSparkPlan CommonDatabaseLink|DbaasDataLink|R
                     +- Scan JDBCRelation(jdbc:oracle:thin:iot/oracle@//database:1521/orcl,(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE),[Lorg.apache.spark.Partition;@3f2ca11a,{url=jdbc:oracle:thin:iot/oracle@//database:1521/orcl, dbtable=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE), driver=oracle.jdbc.driver.OracleDriver, table-name=(SELECT id, name, type_id as type, registration_number as registrationNumber, vin, make, model, year, registration_time, last_reported_time, last_modified_time, last_modified_by as lastModifiedBy, status FROM FM_VEHICLE)})[ID#483,STATUS#495] PushedFilters: [EqualTo(STATUS,STOPPED)]

    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate.doExecute(TungstenAggregate.scala:80)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Exchange.prepareShuffleDependency(Exchange.scala:164)
    at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:254)
    at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:248)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
    ... 30 more
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
SortBasedAggregate(key=[entityId#368], functions=[(MapMetricResults(id#365,paramsHash#366,numericValue#375,booleanValue#376,stringValue#377),mode=Final,isDistinct=false)], output=[entityId#368,entityMetrics#547])
+- ConvertToSafe
   +- Sort [entityId#368 ASC], false, 0
      +- TungstenExchange hashpartitioning(entityId#368,200), None
         +- ConvertToUnsafe
            +- SortBasedAggregate(key=[entityId#368], functions=[(MapMetricResults(id#365,paramsHash#366,numericValue#375,booleanValue#376,stringValue#377),mode=Partial,isDistinct=false)], output=[entityId#368,map#550])
               +- ConvertToSafe
                  +- Sort [entityId#368 ASC], false, 0
                     +- Project [entityId#368,stringValue#377,numericValue#375,booleanValue#376,id#365,paramsHash#366]
                        +- MetricsSparkPlan FM_CommonOutput_NoSql_Link|NoSqlDataLink|R
                           +- Scan oracle.kv.spark.KVStoreAvroRelation@453842ac[entityId#368,stringValue#377,numericValue#375,booleanValue#376,id#365,paramsHash#366] PushedFilters: [EqualTo(eventTime,0), EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df), Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]

    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
    at org.apache.spark.sql.execution.aggregate.SortBasedAggregate.doExecute(SortBasedAggregate.scala:69)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Filter.doExecute(basicOperators.scala:70)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Project.doExecute(basicOperators.scala:46)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Sort.doExecute(Sort.scala:64)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.joins.SortMergeJoin.doExecute(SortMergeJoin.scala:70)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Project.doExecute(basicOperators.scala:46)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate$$anonfun$doExecute$1.apply(TungstenAggregate.scala:86)
    at org.apache.spark.sql.execution.aggregate.TungstenAggregate$$anonfun$doExecute$1.apply(TungstenAggregate.scala:80)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
    ... 39 more
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
TungstenExchange hashpartitioning(entityId#368,200), None
+- ConvertToUnsafe
   +- SortBasedAggregate(key=[entityId#368], functions=[(MapMetricResults(id#365,paramsHash#366,numericValue#375,booleanValue#376,stringValue#377),mode=Partial,isDistinct=false)], output=[entityId#368,map#550])
      +- ConvertToSafe
         +- Sort [entityId#368 ASC], false, 0
            +- Project [entityId#368,stringValue#377,numericValue#375,booleanValue#376,id#365,paramsHash#366]
               +- MetricsSparkPlan FM_CommonOutput_NoSql_Link|NoSqlDataLink|R
                  +- Scan oracle.kv.spark.KVStoreAvroRelation@453842ac[entityId#368,stringValue#377,numericValue#375,booleanValue#376,id#365,paramsHash#366] PushedFilters: [EqualTo(eventTime,0), EqualTo(paramsHash,a2770969-c827-30f2-910f-6179418462df), Or(EqualTo(id,e2668020-3b1f-3b61-8e9b-23d43fa65234),EqualTo(id,171bd431-7566-3414-ad41-315c46507fbf))]

    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:49)
    at org.apache.spark.sql.execution.Exchange.doExecute(Exchange.scala:247)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.Sort.doExecute(Sort.scala:64)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.ConvertToSafe.doExecute(rowFormatConverters.scala:56)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:132)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$5.apply(SparkPlan.scala:130)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:130)
    at org.apache.spark.sql.execution.aggregate.SortBasedAggregate$$anonfun$doExecute$1.apply(SortBasedAggregate.scala:72)
    at org.apache.spark.sql.execution.aggregate.SortBasedAggregate$$anonfun$doExecute$1.apply(SortBasedAggregate.scala:69)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
    ... 72 more
Caused by: java.lang.IllegalArgumentException: Invalid KVStoreConfig. Request timeout: 45,000 ms exceeds socket read timeout: 30,000 ms
    at oracle.kv.KVStoreFactory.getStoreInternal(KVStoreFactory.java:159)
    at oracle.kv.KVStoreFactory.getStore(KVStoreFactory.java:122)
    at oracle.kv.KVStoreFactory.getStore(KVStoreFactory.java:59)
    at oracle.kv.spark.utils.ConnectionDef$$anonfun$createConnection$4.apply(ConnectionDef.scala:40)
    at oracle.kv.spark.utils.ConnectionDef$$anonfun$createConnection$4.apply(ConnectionDef.scala:40)
    at scala.util.Try$.apply(Try.scala:161)
    at oracle.kv.spark.utils.ConnectionDef.createConnection(ConnectionDef.scala:40)
    at oracle.kv.spark.rdd.KVStoreRDD.computePartitions(KVStoreRDD.scala:155)
    at oracle.kv.spark.rdd.KVStoreRDD.getPartitions(KVStoreRDD.scala:152)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.bacs.metrics.MetricsRDD.getPartitions(MetricsRDD.scala:33)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:91)
    at org.apache.spark.sql.execution.Exchange.prepareShuffleDependency(Exchange.scala:220)
    at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:254)
    at org.apache.spark.sql.execution.Exchange$$anonfun$doExecute$1.apply(Exchange.scala:248)
    at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:48)
    ... 90 more
[2018-04-19T07:00:57.488Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.application.txflow.TxFlowManagerImpl, appId: 0-AD, du: 1, message: Stopping analytics processor "CAS_DynamicProcessor"...
[2018-04-19T07:00:57.489Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.application.txflow.TxFlowManagerImpl, appId: 0-AD, du: 1, message: Analytics processor "CAS_DynamicProcessor" stopped.
[2018-04-19T07:00:57.503Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.application.txflow.TxFlowManagerImpl, appId: 0-AD, du: 1, message: Analytics processors stopped.
[2018-04-19T07:00:57.504Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.application.ApplicationManager, appId: 0-AD, du: 1, message: TxFlow manager has been asked to stop
[2018-04-19T07:00:57.504Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.application.ApplicationManager, appId: 0-AD, du: 1, message: Stopping application manager
[2018-04-19T07:00:57.505Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.JettyServerController, appId: 0-AD, du: 1, message: Stopping jetty controller
[2018-04-19T07:00:57.582Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.application.Main, appId: 0-AD, du: 1, message: Application manager run finished
[2018-04-19T07:00:57.590Z] oracle.IoT.ApplicationMonitor.Analytics INFO: className: com.oracle.bacs.bootstrap.metrics.AppMetricsRecorder, appId: 0-AD, du: 1, message: Stopping AppMetricsRecorder
[2018-04-19T07:00:57.609Z] oracle.IoT.ApplicationMonitor.Analytics INFO: className: com.oracle.bacs.bootstrap.metrics.AppMetricsRecorder, appId: 0-AD, du: 1, message: AppMetricsRecorder stopped
[2018-04-19T07:00:57.622Z] oracle.IoT.Bootstrap.Analytics INFO: className: com.oracle.bacs.bootstrap.application.utils.JavaSparkContextFactory, appId: 0-AD, du: 1, message: Closing JavaSparkContext "application_1524118654455_0004".
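
Both failures bottom out in the same root cause, "Invalid KVStoreConfig. Request timeout: 45,000 ms exceeds socket read timeout: 30,000 ms", thrown from oracle.kv.KVStoreFactory.getStore when the Spark connector (oracle.kv.spark.utils.ConnectionDef.createConnection) opens its Oracle NoSQL store handle. KVStoreConfig refuses any configuration whose request timeout is larger than its (non-zero) socket read timeout, so the fix is to raise the socket read timeout to at least the request timeout, or to lower the request timeout, wherever the NoSqlDataLink connection properties are defined. A minimal Java sketch of the constraint against the plain oracle.kv client API follows; the store name "kvstore" and helper host "kvhost:5000" are placeholders, not values from this deployment:

    import java.util.concurrent.TimeUnit;

    import oracle.kv.KVStore;
    import oracle.kv.KVStoreConfig;
    import oracle.kv.KVStoreFactory;

    public class KVStoreTimeoutCheck {
        public static void main(String[] args) {
            // Placeholder store name and helper host:port.
            KVStoreConfig config = new KVStoreConfig("kvstore", "kvhost:5000");

            // The trace shows a 45,000 ms request timeout against the default
            // 30,000 ms socket read timeout; getStore() rejects that combination.
            // Keeping socketReadTimeout >= requestTimeout satisfies the check.
            config.setSocketReadTimeout(60, TimeUnit.SECONDS);
            config.setRequestTimeout(45, TimeUnit.SECONDS);

            // With the two timeouts consistent, opening the store no longer
            // throws IllegalArgumentException: Invalid KVStoreConfig.
            KVStore store = KVStoreFactory.getStore(config);
            store.close();
        }
    }

In a deployment like this one the two values would normally be aligned in the link's connection configuration rather than in code, but the constraint enforced by KVStoreConfig is the same either way.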