- :clean
- :compileJava NO-SOURCE
- :compileScala
- :processResources NO-SOURCE
- :classes
- :jar
- :distTar
- :distZip
- :scaladoc
- [ant:scaladoc] Element '/home/i71178/projects/jhh/sanctions-check/build/resources/main' does not exist.
- :scaladocJar
- :sourcesJar
- :assemble
- :compileTestJava NO-SOURCE
- :compileTestScala
- :processTestResources
- :testClasses
- :test
- Discovery starting.
- Discovery completed in 197 milliseconds.
- Run starting. Expected test count is: 6
- MatchConfiguration(abc,def,XY_Abc.csv,3.00,List(MatchVariable(FIRSTNAME,first_name,25,1.00,-1.00,0.00), MatchVariable(LASTNAME,lastName,25,1.00,-1.00,0.00), MatchVariable(ZIP,zip_code,0,1.00,-1.00,0.00)))
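The `MatchConfiguration(...)` line above implies a pair of case classes. A minimal pure-Scala reconstruction, with field names guessed from the positional values in the `toString` output (the real classes in `com.verscend.sanctions` may name these differently):

```scala
// Hypothetical reconstruction of the model implied by the MatchConfiguration
// toString above; all field names are assumptions inferred from position.
case class MatchVariable(
  sourceColumn: String,   // e.g. FIRSTNAME
  searchColumn: String,   // e.g. first_name
  threshold: Int,         // e.g. 25
  agreeWeight: Double,    // e.g. 1.00
  disagreeWeight: Double, // e.g. -1.00
  missWeight: Double      // e.g. 0.00
)

case class MatchConfiguration(
  name: String,
  description: String,
  searchFile: String,
  matchThreshold: Double,
  variables: List[MatchVariable]
)

val cfg = MatchConfiguration(
  "abc", "def", "XY_Abc.csv", 3.00,
  List(
    MatchVariable("FIRSTNAME", "first_name", 25, 1.00, -1.00, 0.00),
    MatchVariable("LASTNAME", "lastName", 25, 1.00, -1.00, 0.00),
    MatchVariable("ZIP", "zip_code", 0, 1.00, -1.00, 0.00)
  )
)
```

The `toString` of a case class prints its fields positionally, which is exactly the shape of the logged line, and the later test "A JSON Config String should yield a MatchConfiguration object" suggests this value is parsed from JSON.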
- 17/04/06 11:29:02 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
- org.apache.spark.SparkContext.<init>(SparkContext.scala:81)
- com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:34)
- com.verscend.sanctions.scalatest.SparkIntegrationTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(TestBase.scala:18)
- com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:40)
- com.verscend.sanctions.scalatest.SparkIntegrationTest.beforeAll(TestBase.scala:18)
- org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
- org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
- com.verscend.sanctions.scalatest.SparkIntegrationTest.run(TestBase.scala:18)
- org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
- java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
- java.util.concurrent.FutureTask.run(FutureTask.java:266)
- java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
- java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
- java.lang.Thread.run(Thread.java:745)
- 17/04/06 11:29:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- 17/04/06 11:29:03 WARN Utils: Your hostname, slcits-l2222 resolves to a loopback address: 127.0.1.1; using 172.27.5.62 instead (on interface eno1)
- 17/04/06 11:29:03 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
- 17/04/06 11:29:05 ERROR SparkContext: Error initializing SparkContext.
- akka.actor.InvalidActorNameException: actor name [ExecutorEndpoint] is not unique!
- at akka.actor.dungeon.ChildrenContainer$NormalChildrenContainer.reserve(ChildrenContainer.scala:130)
- at akka.actor.dungeon.Children$class.reserveChild(Children.scala:76)
- at akka.actor.ActorCell.reserveChild(ActorCell.scala:369)
- at akka.actor.dungeon.Children$class.makeChild(Children.scala:201)
- at akka.actor.dungeon.Children$class.attachChild(Children.scala:41)
- at akka.actor.ActorCell.attachChild(ActorCell.scala:369)
- at akka.actor.ActorSystemImpl.actorOf(ActorSystem.scala:553)
- at org.apache.spark.rpc.akka.AkkaRpcEnv.actorRef$lzycompute$1(AkkaRpcEnv.scala:92)
- at org.apache.spark.rpc.akka.AkkaRpcEnv.org$apache$spark$rpc$akka$AkkaRpcEnv$$actorRef$1(AkkaRpcEnv.scala:92)
- at org.apache.spark.rpc.akka.AkkaRpcEnv$$anonfun$setupEndpoint$1.apply(AkkaRpcEnv.scala:148)
- at org.apache.spark.rpc.akka.AkkaRpcEnv$$anonfun$setupEndpoint$1.apply(AkkaRpcEnv.scala:148)
- at org.apache.spark.rpc.akka.AkkaRpcEndpointRef.actorRef$lzycompute(AkkaRpcEnv.scala:281)
- at org.apache.spark.rpc.akka.AkkaRpcEndpointRef.actorRef(AkkaRpcEnv.scala:281)
- at org.apache.spark.rpc.akka.AkkaRpcEndpointRef.hashCode(AkkaRpcEnv.scala:329)
- at java.util.concurrent.ConcurrentHashMap.putVal(ConcurrentHashMap.java:1012)
- at java.util.concurrent.ConcurrentHashMap.put(ConcurrentHashMap.java:1006)
- at org.apache.spark.rpc.akka.AkkaRpcEnv.registerEndpoint(AkkaRpcEnv.scala:73)
- at org.apache.spark.rpc.akka.AkkaRpcEnv.setupEndpoint(AkkaRpcEnv.scala:149)
- at org.apache.spark.executor.Executor.<init>(Executor.scala:89)
- at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalBackend.scala:57)
- at org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:119)
- at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
- at org.apache.spark.SparkContext.<init>(SparkContext.scala:514)
- at com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:34)
- at com.verscend.sanctions.scalatest.SparkIntegrationTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(TestBase.scala:18)
- at com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:40)
- at com.verscend.sanctions.scalatest.SparkIntegrationTest.beforeAll(TestBase.scala:18)
- at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
- at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
- at com.verscend.sanctions.scalatest.SparkIntegrationTest.run(TestBase.scala:18)
- at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
- at java.util.concurrent.FutureTask.run(FutureTask.java:266)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
- at java.lang.Thread.run(Thread.java:745)
- 17/04/06 11:29:05 ERROR Utils: Uncaught exception in thread ScalaTest-2
- java.lang.NullPointerException
- at org.apache.spark.scheduler.local.LocalBackend.stop(LocalBackend.scala:128)
- at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:439)
- at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1436)
- at org.apache.spark.SparkContext$$anonfun$stop$7.apply$mcV$sp(SparkContext.scala:1715)
- at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
- at org.apache.spark.SparkContext.stop(SparkContext.scala:1714)
- at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
- at com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:34)
- at com.verscend.sanctions.scalatest.SparkIntegrationTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(TestBase.scala:18)
- at com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:40)
- at com.verscend.sanctions.scalatest.SparkIntegrationTest.beforeAll(TestBase.scala:18)
- at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
- at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
- at com.verscend.sanctions.scalatest.SparkIntegrationTest.run(TestBase.scala:18)
- at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
- at java.util.concurrent.FutureTask.run(FutureTask.java:266)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
- at java.lang.Thread.run(Thread.java:745)
- 17/04/06 11:29:05 ERROR Utils: Uncaught exception in thread ScalaTest-2
- java.lang.NullPointerException
- at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
- at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1228)
- at org.apache.spark.SparkEnv.stop(SparkEnv.scala:100)
- at org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1740)
- at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
- at org.apache.spark.SparkContext.stop(SparkContext.scala:1739)
- at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
- at com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:34)
- at com.verscend.sanctions.scalatest.SparkIntegrationTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(TestBase.scala:18)
- at com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:40)
- at com.verscend.sanctions.scalatest.SparkIntegrationTest.beforeAll(TestBase.scala:18)
- at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
- at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
- at com.verscend.sanctions.scalatest.SparkIntegrationTest.run(TestBase.scala:18)
- at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
- at java.util.concurrent.FutureTask.run(FutureTask.java:266)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
- at java.lang.Thread.run(Thread.java:745)
- 17/04/06 11:29:05 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
- org.apache.spark.SparkContext.<init>(SparkContext.scala:81)
- com.holdenkarau.spark.testing.SharedSparkContext$class.beforeAll(SharedSparkContext.scala:34)
- com.verscend.sanctions.scalatest.SparkIntegrationTest.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$beforeAll(TestBase.scala:18)
- com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:40)
- com.verscend.sanctions.scalatest.SparkIntegrationTest.beforeAll(TestBase.scala:18)
- org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
- org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
- com.verscend.sanctions.scalatest.SparkIntegrationTest.run(TestBase.scala:18)
- org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
- java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
- java.util.concurrent.FutureTask.run(FutureTask.java:266)
- java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
- java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
- java.lang.Thread.run(Thread.java:745)
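The two "Another SparkContext is being constructed" warnings above show each suite's `beforeAll` building a fresh `SparkContext` in the same JVM, which Spark forbids (SPARK-2243); the non-unique `[ExecutorEndpoint]` actor name and the subsequent NPEs are the fallout. The usual remedy is to create the context lazily once per JVM and let every suite reuse it (or to fork one test JVM per suite in the build). A self-contained sketch of that sharing pattern, with a hypothetical `Context` class standing in for `SparkContext` so it runs without Spark:

```scala
// Hypothetical stand-in for SparkContext so the sketch needs no Spark jars.
final class Context {
  val id: Long = System.nanoTime() // distinguishes instances
  def stop(): Unit = ()
}

// One lazily created context per JVM; suites reuse it instead of each
// constructing their own in beforeAll.
object SharedContext {
  lazy val instance: Context = new Context
}

// Two "suites" asking for a context observe the same instance.
val a = SharedContext.instance
val b = SharedContext.instance
```

The `SharedSparkContext` trait from spark-testing-base (visible in the stack traces) exists for this purpose, so the failure here likely comes from suites initializing contexts concurrently or from a context that failed mid-construction and was never torn down cleanly.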
- WeightedLevenshteinMatch$Test:
- MatchConfiguration$Test:
- A JSON Config String
- - should yield a MatchConfiguration object (92 milliseconds)
- Functions$Test:
- The levenshtein distance based string matching function
- when given two strings and a matching threshold
- - should return the expected answer (1 second, 799 milliseconds)
- when given a threshold below zero (0)
- - should throw an exception (14 milliseconds)
- Functions$Test:
- The reason text assembler
- when given a sequence of column names and a sequence of column values
- - should return the expected reason text string (1 second, 798 milliseconds)
- SearchNpiData$Test:
- com.verscend.sanctions.search.SearchNpiData$Test *** ABORTED *** (2 seconds, 812 milliseconds)
- akka.actor.InvalidActorNameException: actor name [ExecutorEndpoint] is not unique!
- at akka.actor.dungeon.ChildrenContainer$NormalChildrenContainer.reserve(ChildrenContainer.scala:130)
- at akka.actor.dungeon.Children$class.reserveChild(Children.scala:76)
- at akka.actor.ActorCell.reserveChild(ActorCell.scala:369)
- at akka.actor.dungeon.Children$class.makeChild(Children.scala:201)
- at akka.actor.dungeon.Children$class.attachChild(Children.scala:41)
- at akka.actor.ActorCell.attachChild(ActorCell.scala:369)
- at akka.actor.ActorSystemImpl.actorOf(ActorSystem.scala:553)
- at org.apache.spark.rpc.akka.AkkaRpcEnv.actorRef$lzycompute$1(AkkaRpcEnv.scala:92)
- at org.apache.spark.rpc.akka.AkkaRpcEnv.org$apache$spark$rpc$akka$AkkaRpcEnv$$actorRef$1(AkkaRpcEnv.scala:92)
- at org.apache.spark.rpc.akka.AkkaRpcEnv$$anonfun$setupEndpoint$1.apply(AkkaRpcEnv.scala:148)
- ...
- 17/04/06 11:29:12 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
- 17/04/06 11:29:12 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
- 17/04/06 11:29:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- 17/04/06 11:29:19 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
- 17/04/06 11:29:19 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
- 17/04/06 11:29:20 WARN AkkaRpcEndpointRef: Error sending message [message = Heartbeat(driver,[Lscala.Tuple2;@227af724,null)] in 1 attempts
- org.apache.spark.rpc.RpcTimeoutException: Recipient[Actor[akka://sparkDriver/user/HeartbeatReceiver#292687200]] had already been terminated.. This timeout is controlled by spark.rpc.askTimeout
- at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcEnv.scala:214)
- at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:229)
- at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcEnv.scala:225)
- at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
- at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185)
- at scala.util.Try$.apply(Try.scala:161)
- at scala.util.Failure.recover(Try.scala:185)
- at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:324)
- at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:324)
- at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
- at org.spark-project.guava.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:293)
- at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:133)
- at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
- at scala.concurrent.impl.Promise$DefaultPromise.scala$concurrent$impl$Promise$DefaultPromise$$dispatchOrAddCallback(Promise.scala:280)
- at scala.concurrent.impl.Promise$DefaultPromise.onComplete(Promise.scala:270)
- at scala.concurrent.Future$class.recover(Future.scala:324)
- at scala.concurrent.impl.Promise$DefaultPromise.recover(Promise.scala:153)
- at org.apache.spark.rpc.akka.AkkaRpcEndpointRef.ask(AkkaRpcEnv.scala:319)
- at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:100)
- at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:77)
- at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$reportHeartBeat(Executor.scala:452)
- at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply$mcV$sp(Executor.scala:472)
- at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:472)
- at org.apache.spark.executor.Executor$$anon$1$$anonfun$run$1.apply(Executor.scala:472)
- at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
- at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:472)
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
- at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
- at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
- at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
- at java.lang.Thread.run(Thread.java:745)
- Caused by: akka.pattern.AskTimeoutException: Recipient[Actor[akka://sparkDriver/user/HeartbeatReceiver#292687200]] had already been terminated.
- at akka.pattern.AskableActorRef$.ask$extension(AskSupport.scala:132)
- at org.apache.spark.rpc.akka.AkkaRpcEndpointRef.ask(AkkaRpcEnv.scala:307)
- ... 15 more
- A source Dataframe
- when given a search Dataframe
- MatchConfiguration(testMatch,testMatchDescription,XY_Abc,3.0,List(MatchVariable(FIRST_NAME,firstName,25,1.0,-1.0,0.0), MatchVariable(LAST_NAME,lastName,25,2.0,-2.0,0.0), MatchVariable(ZIP,zipCode,0,1.0,-1.0,0.0)))
- root
- |-- PRVSEQ: string (nullable = true)
- |-- PROV_NUM: string (nullable = true)
- |-- src_reason_text: string (nullable = true)
- |-- sch_reason_text: string (nullable = true)
- |-- src_FIRST_NAME: string (nullable = true)
- |-- src_FIRST_NAME_len: integer (nullable = true)
- |-- sch_firstName: string (nullable = true)
- |-- sch_firstName_len: integer (nullable = true)
- |-- src_FIRST_NAME_sch_firstName_lev_thrsh: string (nullable = false)
- |-- src_FIRST_NAME_sch_firstName_lev_agree_wt: string (nullable = false)
- |-- src_FIRST_NAME_sch_firstName_lev_disagree_wt: string (nullable = false)
- |-- src_FIRST_NAME_sch_firstName_lev_miss_wt: string (nullable = false)
- |-- src_FIRST_NAME_sch_firstName_lev: integer (nullable = true)
- |-- src_FIRST_NAME_sch_firstName_lev_pct: double (nullable = true)
- |-- src_FIRST_NAME_sch_firstName_match_wt: double (nullable = true)
- |-- src_LAST_NAME: string (nullable = true)
- |-- src_LAST_NAME_len: integer (nullable = true)
- |-- sch_lastName: string (nullable = true)
- |-- sch_lastName_len: integer (nullable = true)
- |-- src_LAST_NAME_sch_lastName_lev_thrsh: string (nullable = false)
- |-- src_LAST_NAME_sch_lastName_lev_agree_wt: string (nullable = false)
- |-- src_LAST_NAME_sch_lastName_lev_disagree_wt: string (nullable = false)
- |-- src_LAST_NAME_sch_lastName_lev_miss_wt: string (nullable = false)
- |-- src_LAST_NAME_sch_lastName_lev: integer (nullable = true)
- |-- src_LAST_NAME_sch_lastName_lev_pct: double (nullable = true)
- |-- src_LAST_NAME_sch_lastName_match_wt: double (nullable = true)
- |-- src_ZIP: string (nullable = true)
- |-- src_ZIP_len: integer (nullable = true)
- |-- sch_zipCode: string (nullable = true)
- |-- sch_zipCode_len: integer (nullable = true)
- |-- src_ZIP_sch_zipCode_lev_thrsh: string (nullable = false)
- |-- src_ZIP_sch_zipCode_lev_agree_wt: string (nullable = false)
- |-- src_ZIP_sch_zipCode_lev_disagree_wt: string (nullable = false)
- |-- src_ZIP_sch_zipCode_lev_miss_wt: string (nullable = false)
- |-- src_ZIP_sch_zipCode_lev: integer (nullable = true)
- |-- src_ZIP_sch_zipCode_lev_pct: double (nullable = true)
- |-- src_ZIP_sch_zipCode_match_wt: double (nullable = true)
- |-- match_threshold: double (nullable = false)
- |-- aggregate_match_weights: double (nullable = true)
- - should be transformed into a match Dataframe *** FAILED *** (2 seconds, 482 milliseconds)
- org.apache.spark.SparkException: Job aborted due to stage failure: Task serialization failed: java.lang.NullPointerException
- org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
- org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
- org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
- org.apache.spark.SparkContext.broadcast(SparkContext.scala:1318)
- org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:861)
- org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:772)
- org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:757)
- org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1463)
- org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1455)
- org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1444)
- org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
- at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1280)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1268)
- at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1267)
- at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
- at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
- at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1267)
- at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:871)
- at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:772)
- at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:757)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1463)
- ...
- Cause: java.lang.NullPointerException:
- at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
- at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
- at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
- at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1318)
- at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitMissingTasks(DAGScheduler.scala:861)
- at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:772)
- at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:757)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1463)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1455)
- at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1444)
- ...
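The schema printed earlier shows that each source/search column pair produces a `_lev` distance, a `_lev_pct`, and a `_match_wt` column, with an `aggregate_match_weights` compared against `match_threshold`. A pure-Scala sketch of that per-variable scoring (the percent and threshold semantics are assumptions; the real job computes these as Spark columns, and Spark SQL also ships a built-in `levenshtein` function):

```scala
// Classic dynamic-programming Levenshtein edit distance.
def levenshtein(a: String, b: String): Int = {
  val dp = Array.tabulate(a.length + 1, b.length + 1) { (i, j) =>
    if (i == 0) j else if (j == 0) i else 0
  }
  for (i <- 1 to a.length; j <- 1 to b.length) {
    val cost = if (a(i - 1) == b(j - 1)) 0 else 1
    dp(i)(j) = math.min(math.min(dp(i - 1)(j) + 1, dp(i)(j - 1) + 1),
                        dp(i - 1)(j - 1) + cost)
  }
  dp(a.length)(b.length)
}

// Percent dissimilarity relative to the longer string (assumed definition
// of the _lev_pct column).
def levPct(a: String, b: String): Double =
  if (a.isEmpty && b.isEmpty) 0.0
  else levenshtein(a, b).toDouble * 100.0 / math.max(a.length, b.length)

// Choose the agree/disagree/miss weight for one variable pair (assumed
// semantics: within the threshold percent counts as agreement, a missing
// value on either side takes the miss weight).
def matchWeight(a: Option[String], b: Option[String], thresholdPct: Int,
                agree: Double, disagree: Double, miss: Double): Double =
  (a, b) match {
    case (Some(x), Some(y)) =>
      if (levPct(x, y) <= thresholdPct) agree else disagree
    case _ => miss
  }
```

The aggregate score would then be the sum of the per-variable weights, matched when it meets `match_threshold`; none of that is the cause of the failure above, which is the broadcast NPE from the torn-down SparkContext, not the matching logic itself.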
- Run completed in 20 seconds, 680 milliseconds.
- Total number of tests run: 5
- Suites: completed 5, aborted 1
- Tests: succeeded 4, failed 1, canceled 0, ignored 0, pending 0
- *** 1 SUITE ABORTED ***
- *** 1 TEST FAILED ***
- :test FAILED
- FAILURE: Build failed with an exception.
- * What went wrong:
- Execution failed for task ':test'.
- > There were failing tests. See the report at: file:///home/i71178/projects/jhh/sanctions-check/build/reports/tests/test/index.html
- * Try:
- Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
- BUILD FAILED
- Total time: 47.318 secs