WARN SparkContext: Another SparkContext is being constructed

a guest
Apr 6th, 2017
:clean
:compileJava NO-SOURCE
:compileScala
:processResources NO-SOURCE
:classes
:jar
:distTar
:distZip
:scaladoc
[ant:scaladoc] Element '/home/i71178/projects/jhh/sanctions-check/build/resources/main' does not exist.
:scaladocJar
:sourcesJar
:assemble
:compileTestJava NO-SOURCE
:compileTestScala
:processTestResources
:testClasses
:test
Discovery starting.
Discovery completed in 197 milliseconds.
Run starting. Expected test count is: 6
MatchConfiguration(abc,def,XY_Abc.csv,3.00,List(MatchVariable(FIRSTNAME,first_name,25,1.00,-1.00,0.00), MatchVariable(LASTNAME,lastName,25,1.00,-1.00,0.00), MatchVariable(ZIP,zip_code,0,1.00,-1.00,0.00)))
17/04/06 11:29:02 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:81)
com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:34)
com.verscend.sanctions.scalatest.SparkIntegrationTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(TestBase.scala:18)
com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:40)
com.verscend.sanctions.scalatest.SparkIntegrationTest.beforeAll(TestBase.scala:18)
org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
com.verscend.sanctions.scalatest.SparkIntegrationTest.run(TestBase.scala:18)
org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
java.util.concurrent.FutureTask.run(FutureTask.java:266)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)
17/04/06 11:29:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/06 11:29:03 WARN Utils: Your hostname, slcits-l2222 resolves to a loopback address: 127.0.1.1; using 172.27.5.62 instead (on interface eno1)
17/04/06 11:29:03 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
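The two Utils warnings above are benign but noisy: the hostname resolves to a loopback address, so Spark picks a bind interface on its own. A minimal way to silence them (a sketch; 127.0.0.1 is an assumption, substitute whichever address you actually want Spark to bind) is to set SPARK_LOCAL_IP before running the build:

```shell
# Sketch: pin Spark's bind address explicitly so it stops guessing.
# The address below is an assumption -- use the one appropriate for your host.
export SPARK_LOCAL_IP=127.0.0.1
echo "$SPARK_LOCAL_IP"
```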
17/04/06 11:29:05 ERROR SparkContext: Error initializing SparkContext.
akka.actor.InvalidActorNameException: actor name [ExecutorEndpoint] is not unique!
    at akka.actor.dungeon.ChildrenContainer$NormalChildrenContainer.reserve(ChildrenContainer.scala:130)
    at akka.actor.dungeon.Children$class.reserveChild(Children.scala:76)
    at akka.actor.ActorCell.reserveChild(ActorCell.scala:369)
    at akka.actor.dungeon.Children$class.makeChild(Children.scala:201)
    at akka.actor.dungeon.Children$class.attachChild(Children.scala:41)
    at akka.actor.ActorCell.attachChild(ActorCell.scala:369)
    at akka.actor.ActorSystemImpl.actorOf(ActorSystem.scala:553)
    at org.apache.spark.rpc.akka.AkkaRpcEnv.actorRef$lzycompute$1(AkkaRpcEnv.scala:92)
    at org.apache.spark.rpc.akka.AkkaRpcEnv.org$apache$spark$rpc$akka$AkkaRpcEnv$$actorRef$1(AkkaRpcEnv.scala:92)
    at org.apache.spark.rpc.akka.AkkaRpcEnv$$anonfun$setupEndpoint$1.apply(AkkaRpcEnv.scala:148)
    at org.apache.spark.rpc.akka.AkkaRpcEnv$$anonfun$setupEndpoint$1.apply(AkkaRpcEnv.scala:148)
    at org.apache.spark.rpc.akka.AkkaRpcEndpointRef.actorRef$lzycompute(AkkaRpcEnv.scala:281)
    at org.apache.spark.rpc.akka.AkkaRpcEndpointRef.actorRef(AkkaRpcEnv.scala:281)
    at org.apache.spark.rpc.akka.AkkaRpcEndpointRef.hashCode(AkkaRpcEnv.scala:329)
    at java.util.concurrent.ConcurrentHashMap.putVal(ConcurrentHashMap.java:1012)
    at java.util.concurrent.ConcurrentHashMap.put(ConcurrentHashMap.java:1006)
    at org.apache.spark.rpc.akka.AkkaRpcEnv.registerEndpoint(AkkaRpcEnv.scala:73)
    at org.apache.spark.rpc.akka.AkkaRpcEnv.setupEndpoint(AkkaRpcEnv.scala:149)
    at org.apache.spark.executor.Executor.<init>(Executor.scala:89)
    at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalBackend.scala:57)
    at org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:119)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:514)
    at com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:34)
    at com.verscend.sanctions.scalatest.SparkIntegrationTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(TestBase.scala:18)
    at com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:40)
    at com.verscend.sanctions.scalatest.SparkIntegrationTest.beforeAll(TestBase.scala:18)
    at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
    at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
    at com.verscend.sanctions.scalatest.SparkIntegrationTest.run(TestBase.scala:18)
    at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
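The InvalidActorNameException above is the classic symptom of two SparkContexts (and hence two ExecutorEndpoint actors) being constructed in the same JVM, which happens when suites that each mix in SharedSparkContext run concurrently. A hedged workaround, assuming the standard Gradle `test` task (whether this suffices depends on whether the parallelism comes from Gradle forks or from ScalaTest's own parallel execution), is to serialize the test JVM:

```groovy
// build.gradle -- sketch only, assuming the stock Gradle `test` task.
// Run suites in a single worker JVM, one at a time, so that at most one
// SparkContext (and its Akka actor system) exists per JVM at any moment.
test {
    maxParallelForks = 1   // no concurrent test JVMs
    forkEvery = 0          // reuse one JVM for the whole run (set to 1 to isolate each class)
}
```

If ScalaTest itself is running suites in parallel inside the fork, its parallel-execution setting is the corresponding knob to disable.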
17/04/06 11:29:05 ERROR Utils: Uncaught exception in thread ScalaTest-2
java.lang.NullPointerException
    at org.apache.spark.scheduler.local.LocalBackend.stop(LocalBackend.scala:128)
    at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:439)
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1436)
    at org.apache.spark.SparkContext$$anonfun$stop$7.apply$mcV$sp(SparkContext.scala:1715)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1714)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
    at com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:34)
    at com.verscend.sanctions.scalatest.SparkIntegrationTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(TestBase.scala:18)
    at com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:40)
    at com.verscend.sanctions.scalatest.SparkIntegrationTest.beforeAll(TestBase.scala:18)
    at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
    at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
    at com.verscend.sanctions.scalatest.SparkIntegrationTest.run(TestBase.scala:18)
    at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
17/04/06 11:29:05 ERROR Utils: Uncaught exception in thread ScalaTest-2
java.lang.NullPointerException
    at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
    at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1228)
    at org.apache.spark.SparkEnv.stop(SparkEnv.scala:100)
    at org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1740)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1739)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
    at com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:34)
    at com.verscend.sanctions.scalatest.SparkIntegrationTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(TestBase.scala:18)
    at com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:40)
    at com.verscend.sanctions.scalatest.SparkIntegrationTest.beforeAll(TestBase.scala:18)
    at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
    at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
    at com.verscend.sanctions.scalatest.SparkIntegrationTest.run(TestBase.scala:18)
    at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
17/04/06 11:29:05 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:81)
com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:34)
com.verscend.sanctions.scalatest.SparkIntegrationTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(TestBase.scala:18)
com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:40)
com.verscend.sanctions.scalatest.SparkIntegrationTest.beforeAll(TestBase.scala:18)
org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
com.verscend.sanctions.scalatest.SparkIntegrationTest.run(TestBase.scala:18)
org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
java.util.concurrent.FutureTask.run(FutureTask.java:266)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)
WeightedLevenshteinMatch$Test:
MatchConfiguration$Test:
A JSON Config String
- should yield a MatchConfiguration object (92 milliseconds)
Functions$Test:
The levenshtein distance based string matching function
when given two strings and a matching threshold
- should return the expected answer (1 second, 799 milliseconds)
when given a threshold below zero (0)
- should throw an exception (14 milliseconds)
Functions$Test:
The reason text assembler
when given a sequence of column names and a sequence of column values
- should return the expected reason text string (1 second, 798 milliseconds)
SearchNpiData$Test:
com.verscend.sanctions.search.SearchNpiData$Test *** ABORTED *** (2 seconds, 812 milliseconds)
akka.actor.InvalidActorNameException: actor name [ExecutorEndpoint] is not unique!
    at akka.actor.dungeon.ChildrenContainer$NormalChildrenContainer.reserve(ChildrenContainer.scala:130)
    at akka.actor.dungeon.Children$class.reserveChild(Children.scala:76)
    at akka.actor.ActorCell.reserveChild(ActorCell.scala:369)
    at akka.actor.dungeon.Children$class.makeChild(Children.scala:201)
    at akka.actor.dungeon.Children$class.attachChild(Children.scala:41)
    at akka.actor.ActorCell.attachChild(ActorCell.scala:369)
    at akka.actor.ActorSystemImpl.actorOf(ActorSystem.scala:553)
    at org.apache.spark.rpc.akka.AkkaRpcEnv.actorRef$lzycompute$1(AkkaRpcEnv.scala:92)
    at org.apache.spark.rpc.akka.AkkaRpcEnv.org$apache$spark$rpc$akka$AkkaRpcEnv$$actorRef$1(AkkaRpcEnv.scala:92)
    at org.apache.spark.rpc.akka.AkkaRpcEnv$$anonfun$setupEndpoint$1.apply(AkkaRpcEnv.scala:148)
    ...
17/04/06 11:29:12 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/04/06 11:29:12 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/04/06 11:29:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/06 11:29:19 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/04/06 11:29:19 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/04/06 11:29:20 WARN AkkaRpcEndpointRef: Error sending message [message = Heartbeat(driver,[Lscala.Tuple2;@227af724,null)] in 1 attempts
org.apache.spark.rpc.RpcTimeoutException: Recipient[Actor[akka://sparkDriver/user/HeartbeatReceiver#292687200]] had already been terminated.. This timeout is controlled by spark.rpc.askTimeout
    at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcEnv.scala:214)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:229)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:225)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
    at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185)
    at scala.util.Try$.apply(Try.scala:161)
    at scala.util.Failure.recover(Try.scala:185)
    at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:324)
    at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:324)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at org.spark-project.guava.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:293)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:133)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
    at scala.concurrent.impl.Promise$DefaultPromise.scala$concurrent$impl$Promise$DefaultPromise$$dispatchOrAddCallback(Promise.scala:280)
    at scala.concurrent.impl.Promise$DefaultPromise.onComplete(Promise.scala:270)
    at scala.concurrent.Future$class.recover(Future.scala:324)
    at scala.concurrent.impl.Promise$DefaultPromise.recover(Promise.scala:153)
    at org.apache.spark.rpc.akka.AkkaRpcEndpointRef.ask(AkkaRpcEnv.scala:319)
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:100)
    at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:77)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$reportHeartBeat(Executor.scala:452)
    at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply$mcV$sp(Executor.scala:472)
    at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:472)
    at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:472)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
    at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:472)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: akka.pattern.AskTimeoutException: Recipient[Actor[akka://sparkDriver/user/HeartbeatReceiver#292687200]] had already been terminated.
    at akka.pattern.AskableActorRef$.ask$extension(AskSupport.scala:132)
    at org.apache.spark.rpc.akka.AkkaRpcEndpointRef.ask(AkkaRpcEnv.scala:307)
    ... 15 more
A source Dataframe
when given a search Dataframe
MatchConfiguration(testMatch,testMatchDescription,XY_Abc,3.0,List(MatchVariable(FIRST_NAME,firstName,25,1.0,-1.0,0.0), MatchVariable(LAST_NAME,lastName,25,2.0,-2.0,0.0), MatchVariable(ZIP,zipCode,0,1.0,-1.0,0.0)))
root
 |-- PRVSEQ: string (nullable = true)
 |-- PROV_NUM: string (nullable = true)
 |-- src_reason_text: string (nullable = true)
 |-- sch_reason_text: string (nullable = true)
 |-- src_FIRST_NAME: string (nullable = true)
 |-- src_FIRST_NAME_len: integer (nullable = true)
 |-- sch_firstName: string (nullable = true)
 |-- sch_firstName_len: integer (nullable = true)
 |-- src_FIRST_NAME_sch_firstName_lev_thrsh: string (nullable = false)
 |-- src_FIRST_NAME_sch_firstName_lev_agree_wt: string (nullable = false)
 |-- src_FIRST_NAME_sch_firstName_lev_disagree_wt: string (nullable = false)
 |-- src_FIRST_NAME_sch_firstName_lev_miss_wt: string (nullable = false)
 |-- src_FIRST_NAME_sch_firstName_lev: integer (nullable = true)
 |-- src_FIRST_NAME_sch_firstName_lev_pct: double (nullable = true)
 |-- src_FIRST_NAME_sch_firstName_match_wt: double (nullable = true)
 |-- src_LAST_NAME: string (nullable = true)
 |-- src_LAST_NAME_len: integer (nullable = true)
 |-- sch_lastName: string (nullable = true)
 |-- sch_lastName_len: integer (nullable = true)
 |-- src_LAST_NAME_sch_lastName_lev_thrsh: string (nullable = false)
 |-- src_LAST_NAME_sch_lastName_lev_agree_wt: string (nullable = false)
 |-- src_LAST_NAME_sch_lastName_lev_disagree_wt: string (nullable = false)
 |-- src_LAST_NAME_sch_lastName_lev_miss_wt: string (nullable = false)
 |-- src_LAST_NAME_sch_lastName_lev: integer (nullable = true)
 |-- src_LAST_NAME_sch_lastName_lev_pct: double (nullable = true)
 |-- src_LAST_NAME_sch_lastName_match_wt: double (nullable = true)
 |-- src_ZIP: string (nullable = true)
 |-- src_ZIP_len: integer (nullable = true)
 |-- sch_zipCode: string (nullable = true)
 |-- sch_zipCode_len: integer (nullable = true)
 |-- src_ZIP_sch_zipCode_lev_thrsh: string (nullable = false)
 |-- src_ZIP_sch_zipCode_lev_agree_wt: string (nullable = false)
 |-- src_ZIP_sch_zipCode_lev_disagree_wt: string (nullable = false)
 |-- src_ZIP_sch_zipCode_lev_miss_wt: string (nullable = false)
 |-- src_ZIP_sch_zipCode_lev: integer (nullable = true)
 |-- src_ZIP_sch_zipCode_lev_pct: double (nullable = true)
 |-- src_ZIP_sch_zipCode_match_wt: double (nullable = true)
 |-- match_threshold: double (nullable = false)
 |-- aggregate_match_weights: double (nullable = true)

- should be transformed into a match Dataframe *** FAILED *** (2 seconds, 482 milliseconds)
org.apache.spark.SparkException: Job aborted due to stage failure: Task serialization failed: java.lang.NullPointerException
org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
org.apache.spark.SparkContext.broadcast(SparkContext.scala:1318)
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:861)
org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:772)
org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:757)
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1463)
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1455)
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1444)
org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1280)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1268)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1267)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1267)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:871)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:772)
    at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:757)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1463)
    ...
Cause: java.lang.NullPointerException:
    at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
    at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
    at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
    at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1318)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:861)
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:772)
    at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:757)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1463)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1455)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1444)
    ...
Run completed in 20 seconds, 680 milliseconds.
Total number of tests run: 5
Suites: completed 5, aborted 1
Tests: succeeded 4, failed 1, canceled 0, ignored 0, pending 0
*** 1 SUITE ABORTED ***
*** 1 TEST FAILED ***
:test FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':test'.
> There were failing tests. See the report at: file:///home/i71178/projects/jhh/sanctions-check/build/reports/tests/test/index.html

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

BUILD FAILED

Total time: 47.318 secs
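The aborted suite and the NullPointerException-laden broadcast failure both trace back to a second SparkContext being constructed while the first is half torn down (SharedSparkContext stops the context in each suite's afterAll). An alternative to serializing the suites is to keep exactly one context for the whole run. A minimal sketch, assuming Spark 1.x as in the log; the object name `SharedSpark` is illustrative, not from the project:

```scala
// Sketch only: one lazily-created SparkContext shared by every suite,
// instead of each suite's beforeAll constructing (and racing on) its own.
import org.apache.spark.{SparkConf, SparkContext}

object SharedSpark {
  // One context per JVM; never stopped until the JVM exits, so no suite
  // can observe another suite's context mid-teardown.
  lazy val sc: SparkContext = new SparkContext(
    new SparkConf()
      .setMaster("local[2]")
      .setAppName("sanctions-check-tests")
  )
}
```

Suites would then reference `SharedSpark.sc` (for example from a common test base trait) rather than mixing in SharedSparkContext, and nothing calls `sc.stop()` between suites.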