Log4J output

a guest
Oct 2nd, 2023
23/10/02 06:47:20 INFO DriverDaemon$: Started Log4j2
23/10/02 06:47:23 INFO DriverDaemon$: Current JVM Version 1.8.0_372
23/10/02 06:47:24 INFO DriverDaemon$: ========== driver starting up ==========
23/10/02 06:47:24 INFO DriverDaemon$: Java: Azul Systems, Inc. 1.8.0_372
23/10/02 06:47:24 INFO DriverDaemon$: OS: Linux/amd64 5.15.0-1042-azure
23/10/02 06:47:24 INFO DriverDaemon$: CWD: /databricks/driver
23/10/02 06:47:24 INFO DriverDaemon$: Mem: Max: 6.3G loaded GCs: PS Scavenge, PS MarkSweep
23/10/02 06:47:24 INFO DriverDaemon$: Logging multibyte characters: ✓
23/10/02 06:47:24 INFO DriverDaemon$: 'publicFile.rolling.rewrite' appender in root logger: class org.apache.logging.log4j.core.appender.rewrite.RewriteAppender
23/10/02 06:47:24 INFO DriverDaemon$: == Modules:
23/10/02 06:47:24 INFO DynamicLoggingConf: Configured feature flag data source LaunchDarkly
23/10/02 06:47:24 INFO DynamicLoggingConf: Configured feature flag data source LaunchDarkly
23/10/02 06:47:24 WARN DynamicLoggingConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/10/02 06:47:24 INFO FeatureFlagRegisterConf: Configured feature flag data source LaunchDarkly
23/10/02 06:47:24 INFO FeatureFlagRegisterConf: Configured feature flag data source LaunchDarkly
23/10/02 06:47:24 WARN FeatureFlagRegisterConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/10/02 06:47:25 INFO DriverDaemon$: Starting prometheus metrics log export timer
23/10/02 06:47:25 INFO DriverConf: Configured feature flag data source LaunchDarkly
23/10/02 06:47:25 INFO DriverConf: Configured feature flag data source LaunchDarkly
23/10/02 06:47:25 WARN DriverConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/10/02 06:47:25 INFO DriverDaemon$: Loaded JDBC drivers in 120 ms
23/10/02 06:47:25 INFO DriverDaemon$: Universe Git Hash: 4bfecaa31575b040f75ca7a4a539c6692fdc153c
23/10/02 06:47:25 INFO DriverDaemon$: Spark Git Hash: c3998998144bf322f53c9b6c4192ee636b4aa1ed
23/10/02 06:47:25 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.password, old value Some(), new value gb1gQqZ9ZIHS, new value will take effect.
23/10/02 06:47:25 WARN SparkConfUtils$: Setting the same key twice for spark.databricks.io.directoryCommit.enableLogicalDelete, old value Some(false), new value false, new value will take effect.
23/10/02 06:47:25 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.path, old value Some(/databricks/keys/jetty_ssl_driver_keystore.jks), new value /databricks/keys/jetty-ssl-driver-keystore.jks, new value will take effect.
23/10/02 06:47:25 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
23/10/02 06:47:25 WARN RunHelpers$: Missing tag isolation client: java.util.NoSuchElementException: key not found: TagDefinition(clientType,The client type for a request, used for isolating resources for the request.,DATA_LABEL_SYSTEM_NOT_SENSITIVE,false,false,List(),UsageLogRedactionConfig(List()))
23/10/02 06:47:25 INFO DatabricksILoop$: Creating throwaway interpreter
23/10/02 06:47:25 INFO MetastoreMonitor$: Internal metastore configured
23/10/02 06:47:25 INFO DataSourceFactory$: DataSource Jdbc URL: jdbc:mariadb://consolidated-australiaeast-prod-metastore-addl-1.mysql.database.azure.com:3306/organization4805034236521897?useSSL=true&sslMode=VERIFY_CA&disableSslHostnameVerification=true&trustServerCertificate=false&serverSslCert=/databricks/common/mysql-ssl-ca-cert.crt
23/10/02 06:47:25 INFO NestedConnectionMonitor$$anon$1: Configured feature flag data source LaunchDarkly
23/10/02 06:47:25 INFO NestedConnectionMonitor$$anon$1: Configured feature flag data source LaunchDarkly
23/10/02 06:47:25 WARN NestedConnectionMonitor$$anon$1: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/10/02 06:47:26 INFO DriverCorral: Creating the driver context
23/10/02 06:47:26 INFO DatabricksILoop$: Class Server Dir: /local_disk0/tmp/repl/spark-7666683401531837394-987aa9b6-0e13-4dde-8745-8bb756ba9bd3
23/10/02 06:47:26 INFO HikariDataSource: metastore-monitor - Starting...
23/10/02 06:47:26 INFO HikariDataSource: metastore-monitor - Start completed.
23/10/02 06:47:26 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.password, old value Some(), new value gb1gQqZ9ZIHS, new value will take effect.
23/10/02 06:47:26 WARN SparkConfUtils$: Setting the same key twice for spark.databricks.io.directoryCommit.enableLogicalDelete, old value Some(false), new value false, new value will take effect.
23/10/02 06:47:26 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.path, old value Some(/databricks/keys/jetty_ssl_driver_keystore.jks), new value /databricks/keys/jetty-ssl-driver-keystore.jks, new value will take effect.
23/10/02 06:47:26 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
23/10/02 06:47:26 INFO SparkContext: Running Spark version 3.4.0
23/10/02 06:47:26 INFO DatabricksEdgeConfigs: serverlessEnabled : false
23/10/02 06:47:26 INFO DatabricksEdgeConfigs: perfPackEnabled : true
23/10/02 06:47:26 INFO DatabricksEdgeConfigs: classicSqlEnabled : true
23/10/02 06:47:26 INFO HikariDataSource: metastore-monitor - Shutdown initiated...
23/10/02 06:47:26 INFO HikariDataSource: metastore-monitor - Shutdown completed.
23/10/02 06:47:26 INFO MetastoreMonitor: Metastore healthcheck successful (connection duration = 1253 milliseconds)
23/10/02 06:47:27 INFO ResourceUtils: ==============================================================
23/10/02 06:47:27 INFO ResourceUtils: No custom resources configured for spark.driver.
23/10/02 06:47:27 INFO ResourceUtils: ==============================================================
23/10/02 06:47:27 INFO SparkContext: Submitted application: Databricks Shell
23/10/02 06:47:27 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1821, script: , vendor: , offHeap -> name: offHeap, amount: 5463, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
23/10/02 06:47:27 INFO ResourceProfile: Limiting resource is cpu
23/10/02 06:47:27 INFO ResourceProfileManager: Added ResourceProfile id: 0
23/10/02 06:47:27 INFO SecurityManager: Changing view acls to: root
23/10/02 06:47:27 INFO SecurityManager: Changing modify acls to: root
23/10/02 06:47:27 INFO SecurityManager: Changing view acls groups to:
23/10/02 06:47:27 INFO SecurityManager: Changing modify acls groups to:
23/10/02 06:47:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
23/10/02 06:47:27 INFO Utils: Successfully started service 'sparkDriver' on port 44035.
23/10/02 06:47:27 INFO SparkEnv: Registering MapOutputTracker
23/10/02 06:47:27 INFO SparkEnv: Registering BlockManagerMaster
23/10/02 06:47:27 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
23/10/02 06:47:27 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
23/10/02 06:47:27 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
23/10/02 06:47:28 INFO DiskBlockManager: Created local directory at /local_disk0/blockmgr-ae7e8010-d904-42e2-8ca1-21d909588411
23/10/02 06:47:28 INFO MemoryStore: MemoryStore started with capacity 8.7 GiB
23/10/02 06:47:28 INFO SparkEnv: Registering OutputCommitCoordinator
23/10/02 06:47:28 INFO DriverConf: Configured feature flag data source LaunchDarkly
23/10/02 06:47:28 INFO DriverConf: Configured feature flag data source LaunchDarkly
23/10/02 06:47:28 WARN DriverConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
23/10/02 06:47:28 INFO SparkContext: Spark configuration:
eventLog.rolloverIntervalSeconds=900
libraryDownload.sleepIntervalSeconds=5
libraryDownload.timeoutSeconds=180
spark.akka.frameSize=256
spark.app.name=Databricks Shell
spark.app.startTime=1696229246446
spark.cleaner.referenceTracking.blocking=false
spark.databricks.acl.client=com.databricks.spark.sql.acl.client.SparkSqlAclClient
spark.databricks.acl.provider=com.databricks.sql.acl.ReflectionBackedAclProvider
spark.databricks.acl.scim.client=com.databricks.spark.sql.acl.client.DriverToWebappScimClient
spark.databricks.automl.serviceEnabled=true
spark.databricks.autotune.maintenance.client.classname=com.databricks.maintenanceautocompute.MACClientImpl
spark.databricks.cloudProvider=Azure
spark.databricks.cloudfetch.hasRegionSupport=true
spark.databricks.cloudfetch.requesterClassName=*********(redacted)
spark.databricks.cluster.profile=singleNode
spark.databricks.clusterSource=UI
spark.databricks.clusterUsageTags.attribute_tag_budget=
spark.databricks.clusterUsageTags.attribute_tag_dust_execution_env=
spark.databricks.clusterUsageTags.attribute_tag_dust_maintainer=
spark.databricks.clusterUsageTags.attribute_tag_dust_suite=
spark.databricks.clusterUsageTags.attribute_tag_service=
spark.databricks.clusterUsageTags.autoTerminationMinutes=30
spark.databricks.clusterUsageTags.azureSubscriptionId=f9fbc78c-f20e-44ba-ad50-a7e15c43a004
spark.databricks.clusterUsageTags.cloudProvider=Azure
spark.databricks.clusterUsageTags.clusterAllTags=[{"key":"ResourceClass","value":"SingleNode"},{"key":"Vendor","value":"Databricks"},{"key":"Creator","value":"[email protected]"},{"key":"ClusterName","value":"Dan Corneanu's Cluster"},{"key":"ClusterId","value":"1002-064418-7lm8zo61"},{"key":"DatabricksEnvironment","value":"workerenv-4805034236521897"}]
spark.databricks.clusterUsageTags.clusterAvailability=ON_DEMAND_AZURE
spark.databricks.clusterUsageTags.clusterCreator=Webapp
spark.databricks.clusterUsageTags.clusterFirstOnDemand=1
spark.databricks.clusterUsageTags.clusterGeneration=0
spark.databricks.clusterUsageTags.clusterId=1002-064418-7lm8zo61
spark.databricks.clusterUsageTags.clusterLogDeliveryEnabled=false
spark.databricks.clusterUsageTags.clusterLogDestination=
spark.databricks.clusterUsageTags.clusterMetastoreAccessType=RDS_DIRECT
spark.databricks.clusterUsageTags.clusterName=Dan Corneanu's Cluster
spark.databricks.clusterUsageTags.clusterNoDriverDaemon=false
spark.databricks.clusterUsageTags.clusterNodeType=Standard_DS3_v2
spark.databricks.clusterUsageTags.clusterNumCustomTags=1
spark.databricks.clusterUsageTags.clusterNumSshKeys=0
spark.databricks.clusterUsageTags.clusterOwnerOrgId=4805034236521897
spark.databricks.clusterUsageTags.clusterOwnerUserId=*********(redacted)
spark.databricks.clusterUsageTags.clusterPinned=false
spark.databricks.clusterUsageTags.clusterPythonVersion=3
spark.databricks.clusterUsageTags.clusterResourceClass=SingleNode
spark.databricks.clusterUsageTags.clusterScalingType=fixed_size
spark.databricks.clusterUsageTags.clusterSizeType=VM_CONTAINER
spark.databricks.clusterUsageTags.clusterSku=STANDARD_SKU
spark.databricks.clusterUsageTags.clusterSpotBidMaxPrice=-1.0
spark.databricks.clusterUsageTags.clusterState=Pending
spark.databricks.clusterUsageTags.clusterStateMessage=Starting Spark
spark.databricks.clusterUsageTags.clusterTargetWorkers=0
spark.databricks.clusterUsageTags.clusterUnityCatalogMode=*********(redacted)
spark.databricks.clusterUsageTags.clusterWorkers=0
spark.databricks.clusterUsageTags.containerType=LXC
spark.databricks.clusterUsageTags.dataPlaneRegion=australiaeast
spark.databricks.clusterUsageTags.driverContainerId=71164545dd3c471999e46f4a80a5e47f
spark.databricks.clusterUsageTags.driverContainerPrivateIp=10.139.64.4
spark.databricks.clusterUsageTags.driverInstanceId=c5cec963d17d46699e2b99f577177881
spark.databricks.clusterUsageTags.driverInstancePrivateIp=10.139.0.4
spark.databricks.clusterUsageTags.driverNodeType=Standard_DS3_v2
spark.databricks.clusterUsageTags.driverPublicDns=20.28.245.184
spark.databricks.clusterUsageTags.effectiveSparkVersion=13.2.x-photon-scala2.12
spark.databricks.clusterUsageTags.enableCredentialPassthrough=*********(redacted)
spark.databricks.clusterUsageTags.enableDfAcls=false
spark.databricks.clusterUsageTags.enableElasticDisk=true
spark.databricks.clusterUsageTags.enableGlueCatalogCredentialPassthrough=*********(redacted)
spark.databricks.clusterUsageTags.enableJdbcAutoStart=true
spark.databricks.clusterUsageTags.enableJobsAutostart=true
spark.databricks.clusterUsageTags.enableLocalDiskEncryption=false
spark.databricks.clusterUsageTags.enableSqlAclsOnly=false
spark.databricks.clusterUsageTags.hailEnabled=false
spark.databricks.clusterUsageTags.ignoreTerminationEventInAlerting=false
spark.databricks.clusterUsageTags.instanceWorkerEnvId=workerenv-4805034236521897
spark.databricks.clusterUsageTags.instanceWorkerEnvNetworkType=default
spark.databricks.clusterUsageTags.isDpCpPrivateLinkEnabled=false
spark.databricks.clusterUsageTags.isIMv2Enabled=true
spark.databricks.clusterUsageTags.isServicePrincipalCluster=false
spark.databricks.clusterUsageTags.isSingleUserCluster=*********(redacted)
spark.databricks.clusterUsageTags.managedResourceGroup=mrg-databricks-dan
spark.databricks.clusterUsageTags.ngrokNpipEnabled=false
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Abfss=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Dbfs=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2File=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Gcs=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2S3=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Volumes=0
spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Workspace=0
spark.databricks.clusterUsageTags.numPerGlobalInitScriptsV2=0
spark.databricks.clusterUsageTags.orgId=4805034236521897
spark.databricks.clusterUsageTags.privateLinkEnabled=false
spark.databricks.clusterUsageTags.region=australiaeast
spark.databricks.clusterUsageTags.runtimeEngine=PHOTON
spark.databricks.clusterUsageTags.sparkEnvVarContainsBacktick=false
spark.databricks.clusterUsageTags.sparkEnvVarContainsDollarSign=false
spark.databricks.clusterUsageTags.sparkEnvVarContainsDoubleQuotes=false
spark.databricks.clusterUsageTags.sparkEnvVarContainsEscape=false
spark.databricks.clusterUsageTags.sparkEnvVarContainsNewline=false
spark.databricks.clusterUsageTags.sparkEnvVarContainsSingleQuotes=false
spark.databricks.clusterUsageTags.sparkImageLabel=release__13.2.x-snapshot-photon-scala2.12__databricks-universe__13.2.5__4bfecaa__c399899__jenkins__4f6bedc__format-3
spark.databricks.clusterUsageTags.sparkMasterUrlType=*********(redacted)
spark.databricks.clusterUsageTags.sparkVersion=13.2.x-photon-scala2.12
spark.databricks.clusterUsageTags.userId=*********(redacted)
spark.databricks.clusterUsageTags.userProvidedRemoteVolumeCount=*********(redacted)
spark.databricks.clusterUsageTags.userProvidedRemoteVolumeSizeGb=*********(redacted)
spark.databricks.clusterUsageTags.userProvidedRemoteVolumeType=*********(redacted)
spark.databricks.clusterUsageTags.userProvidedSparkVersion=*********(redacted)
spark.databricks.clusterUsageTags.workerEnvironmentId=workerenv-4805034236521897
spark.databricks.credential.aws.secretKey.redactor=*********(redacted)
spark.databricks.credential.redactor=*********(redacted)
spark.databricks.credential.scope.fs.adls.gen2.tokenProviderClassName=*********(redacted)
spark.databricks.credential.scope.fs.gs.auth.access.tokenProviderClassName=*********(redacted)
spark.databricks.credential.scope.fs.impl=*********(redacted)
spark.databricks.credential.scope.fs.s3a.tokenProviderClassName=*********(redacted)
spark.databricks.delta.logStore.crossCloud.fatal=true
spark.databricks.delta.multiClusterWrites.enabled=true
spark.databricks.delta.preview.enabled=true
spark.databricks.driverNfs.clusterWidePythonLibsEnabled=true
spark.databricks.driverNfs.enabled=true
spark.databricks.driverNfs.pathSuffix=.ephemeral_nfs
spark.databricks.driverNodeTypeId=Standard_DS3_v2
spark.databricks.enablePublicDbfsFuse=false
spark.databricks.eventLog.dir=eventlogs
spark.databricks.eventLog.enabled=true
spark.databricks.eventLog.listenerClassName=com.databricks.backend.daemon.driver.DBCEventLoggingListener
spark.databricks.io.directoryCommit.enableLogicalDelete=false
spark.databricks.managedCatalog.clientClassName=com.databricks.managedcatalog.ManagedCatalogClientImpl
spark.databricks.metrics.filesystem_io_metrics=true
spark.databricks.mlflow.autologging.enabled=true
spark.databricks.overrideDefaultCommitProtocol=org.apache.spark.sql.execution.datasources.SQLHadoopMapReduceCommitProtocol
spark.databricks.passthrough.adls.gen2.tokenProviderClassName=*********(redacted)
spark.databricks.passthrough.adls.tokenProviderClassName=*********(redacted)
spark.databricks.passthrough.enabled=true
spark.databricks.passthrough.glue.credentialsProviderFactoryClassName=*********(redacted)
spark.databricks.passthrough.glue.executorServiceFactoryClassName=*********(redacted)
spark.databricks.passthrough.oauth.refresher.impl=*********(redacted)
spark.databricks.passthrough.s3a.threadPoolExecutor.factory.class=com.databricks.backend.daemon.driver.aws.S3APassthroughThreadPoolExecutorFactory
spark.databricks.passthrough.s3a.tokenProviderClassName=*********(redacted)
spark.databricks.preemption.enabled=true
spark.databricks.privateLinkEnabled=false
spark.databricks.redactor=com.databricks.spark.util.DatabricksSparkLogRedactorProxy
spark.databricks.repl.enableClassFileCleanup=true
spark.databricks.secret.envVar.keys.toRedact=*********(redacted)
spark.databricks.secret.sparkConf.keys.toRedact=*********(redacted)
spark.databricks.service.dbutils.repl.backend=com.databricks.dbconnect.ReplDBUtils
spark.databricks.service.dbutils.server.backend=com.databricks.dbconnect.SparkServerDBUtils
spark.databricks.session.share=false
spark.databricks.sparkContextId=7666683401531837394
spark.databricks.sql.configMapperClass=com.databricks.dbsql.config.SqlConfigMapperBridge
spark.databricks.tahoe.logStore.aws.class=com.databricks.tahoe.store.MultiClusterLogStore
spark.databricks.tahoe.logStore.azure.class=com.databricks.tahoe.store.AzureLogStore
spark.databricks.tahoe.logStore.class=com.databricks.tahoe.store.DelegatingLogStore
spark.databricks.tahoe.logStore.gcp.class=com.databricks.tahoe.store.GCPLogStore
spark.databricks.unityCatalog.credentialManager.apiTokenProviderClassName=*********(redacted)
spark.databricks.unityCatalog.credentialManager.tokenRefreshEnabled=*********(redacted)
spark.databricks.unityCatalog.volumes.fuse.server.enabled=true
spark.databricks.workerNodeTypeId=Standard_DS3_v2
spark.databricks.workspaceUrl=*********(redacted)
spark.databricks.wsfs.workspacePrivatePreview=true
spark.databricks.wsfsPublicPreview=true
spark.delta.sharing.profile.provider.class=*********(redacted)
spark.driver.allowMultipleContexts=false
spark.driver.extraJavaOptions=-Djava.net.preferIPv6Addresses=false -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED --add-opens=java.management/sun.management=ALL-UNNAMED -Djdk.reflect.useDirectMethodHandle=false
spark.driver.host=10.139.64.4
spark.driver.maxResultSize=4g
spark.driver.port=44035
spark.driver.tempDirectory=/local_disk0/tmp
spark.eventLog.enabled=false
spark.executor.extraClassPath=/databricks/spark/dbconf/log4j/executor:/databricks/spark/dbconf/jets3t/:/databricks/spark/dbconf/hadoop:/databricks/hive/conf:/databricks/jars/*
spark.executor.extraJavaOptions=-Djava.net.preferIPv6Addresses=false -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED --add-opens=java.management/sun.management=ALL-UNNAMED -Djdk.reflect.useDirectMethodHandle=false -Djava.io.tmpdir=/local_disk0/tmp -XX:ReservedCodeCacheSize=512m -XX:+UseCodeCacheFlushing -XX:PerMethodRecompilationCutoff=-1 -XX:PerBytecodeRecompilationCutoff=-1 -Djava.security.properties=/databricks/spark/dbconf/java/extra.security -XX:-UseContainerSupport -XX:+PrintFlagsFinal -XX:+PrintGCDateStamps -XX:+PrintGCDetails -verbose:gc -Xss4m -Djava.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni -Djavax.xml.datatype.DatatypeFactory=com.sun.org.apache.xerces.internal.jaxp.datatype.DatatypeFactoryImpl -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl -Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl -Djavax.xml.validation.SchemaFactory:http://www.w3.org/2001/XMLSchema=com.sun.org.apache.xerces.internal.jaxp.validation.XMLSchemaFactory -Dorg.xml.sax.driver=com.sun.org.apache.xerces.internal.parsers.SAXParser -Dorg.w3c.dom.DOMImplementationSourceList=com.sun.org.apache.xerces.internal.dom.DOMXSImplementationSourceImpl -Djavax.net.ssl.sessionCacheSize=10000 -Dscala.reflect.runtime.disable.typetag.cache=true -Dcom.google.cloud.spark.bigquery.repackaged.io.netty.tryReflectionSetAccessible=true -Dlog4j2.formatMsgNoLookups=true -verbose:gc -Xloggc:/dev/stdout -verbose:class -XX:+UnlockDiagnosticVMOptions -XX:+LogVMOutput -XX:-DisplayVMOutput -XX:LogFile=/databricks/databricks_vm_pipe -Ddatabricks.vmLog.pipe=/databricks/databricks_vm_pipe -Ddatabricks.serviceName=spark-executor-1
spark.executor.id=driver
spark.executor.memory=1821m
spark.executor.tempDirectory=/local_disk0/tmp
spark.extraListeners=com.databricks.backend.daemon.driver.DBCEventLoggingListener
spark.files.fetchFailure.unRegisterOutputOnHost=true
spark.files.overwrite=true
spark.files.useFetchCache=false
spark.hadoop.databricks.dbfs.client.version=v2
spark.hadoop.databricks.fs.perfMetrics.enable=true
spark.hadoop.databricks.loki.fileStatusCache.abfs.enabled=true
spark.hadoop.databricks.loki.fileStatusCache.gcs.enabled=true
spark.hadoop.databricks.loki.fileStatusCache.s3a.enabled=true
spark.hadoop.databricks.loki.fileSystemCache.enabled=true
spark.hadoop.databricks.s3.create.deleteUnnecessaryFakeDirectories=false
spark.hadoop.databricks.s3.verifyBucketExists.enabled=false
spark.hadoop.databricks.s3commit.client.sslTrustAll=false
spark.hadoop.fs.AbstractFileSystem.gs.impl=shaded.databricks.com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS
spark.hadoop.fs.abfs.impl=com.databricks.common.filesystem.LokiFileSystem
spark.hadoop.fs.abfs.impl.disable.cache=true
spark.hadoop.fs.abfss.impl=com.databricks.common.filesystem.LokiFileSystem
spark.hadoop.fs.abfss.impl.disable.cache=true
spark.hadoop.fs.adl.impl=com.databricks.common.filesystem.LokiFileSystem
spark.hadoop.fs.adl.impl.disable.cache=true
spark.hadoop.fs.azure.authorization.caching.enable=false
spark.hadoop.fs.azure.cache.invalidator.type=com.databricks.encryption.utils.CacheInvalidatorImpl
spark.hadoop.fs.azure.readaheadqueue.depth=0
spark.hadoop.fs.azure.skip.metrics=true
spark.hadoop.fs.azure.user.agent.prefix=*********(redacted)
spark.hadoop.fs.cpfs-abfss.impl=*********(redacted)
spark.hadoop.fs.cpfs-abfss.impl.disable.cache=true
spark.hadoop.fs.cpfs-adl.impl=*********(redacted)
spark.hadoop.fs.cpfs-adl.impl.disable.cache=true
spark.hadoop.fs.cpfs-s3.impl=*********(redacted)
spark.hadoop.fs.cpfs-s3a.impl=*********(redacted)
spark.hadoop.fs.cpfs-s3n.impl=*********(redacted)
spark.hadoop.fs.dbfs.impl=com.databricks.backend.daemon.data.client.DbfsHadoop3
spark.hadoop.fs.dbfsartifacts.impl=com.databricks.backend.daemon.data.client.DBFSV1
spark.hadoop.fs.fcfs-abfs.impl=*********(redacted)
spark.hadoop.fs.fcfs-abfs.impl.disable.cache=true
spark.hadoop.fs.fcfs-abfss.impl=*********(redacted)
spark.hadoop.fs.fcfs-abfss.impl.disable.cache=true
spark.hadoop.fs.fcfs-s3.impl=*********(redacted)
spark.hadoop.fs.fcfs-s3.impl.disable.cache=true
spark.hadoop.fs.fcfs-s3a.impl=*********(redacted)
spark.hadoop.fs.fcfs-s3a.impl.disable.cache=true
spark.hadoop.fs.fcfs-s3n.impl=*********(redacted)
spark.hadoop.fs.fcfs-s3n.impl.disable.cache=true
spark.hadoop.fs.fcfs-wasb.impl=*********(redacted)
spark.hadoop.fs.fcfs-wasb.impl.disable.cache=true
spark.hadoop.fs.fcfs-wasbs.impl=*********(redacted)
spark.hadoop.fs.fcfs-wasbs.impl.disable.cache=true
spark.hadoop.fs.file.impl=com.databricks.backend.daemon.driver.WorkspaceLocalFileSystem
spark.hadoop.fs.gs.impl=com.databricks.common.filesystem.LokiFileSystem
spark.hadoop.fs.gs.impl.disable.cache=true
spark.hadoop.fs.gs.outputstream.upload.chunk.size=16777216
spark.hadoop.fs.idbfs.impl=com.databricks.io.idbfs.IdbfsFileSystem
spark.hadoop.fs.mlflowdbfs.impl=com.databricks.mlflowdbfs.MlflowdbfsFileSystem
spark.hadoop.fs.s3.impl=com.databricks.common.filesystem.LokiFileSystem
spark.hadoop.fs.s3.impl.disable.cache=true
spark.hadoop.fs.s3a.assumed.role.credentials.provider=*********(redacted)
spark.hadoop.fs.s3a.attempts.maximum=10
spark.hadoop.fs.s3a.block.size=67108864
spark.hadoop.fs.s3a.connection.maximum=200
spark.hadoop.fs.s3a.connection.timeout=50000
spark.hadoop.fs.s3a.fast.upload=true
spark.hadoop.fs.s3a.fast.upload.active.blocks=32
spark.hadoop.fs.s3a.fast.upload.default=true
spark.hadoop.fs.s3a.impl=com.databricks.common.filesystem.LokiFileSystem
spark.hadoop.fs.s3a.impl.disable.cache=true
spark.hadoop.fs.s3a.max.total.tasks=1000
spark.hadoop.fs.s3a.multipart.size=10485760
spark.hadoop.fs.s3a.multipart.threshold=104857600
spark.hadoop.fs.s3a.retry.interval=250ms
spark.hadoop.fs.s3a.retry.limit=6
spark.hadoop.fs.s3a.retry.throttle.interval=500ms
spark.hadoop.fs.s3a.threads.max=136
spark.hadoop.fs.s3n.impl=com.databricks.common.filesystem.LokiFileSystem
spark.hadoop.fs.s3n.impl.disable.cache=true
spark.hadoop.fs.stage.impl=com.databricks.backend.daemon.driver.managedcatalog.PersonalStagingFileSystem
spark.hadoop.fs.stage.impl.disable.cache=true
spark.hadoop.fs.wasb.impl=com.databricks.common.filesystem.LokiFileSystem
spark.hadoop.fs.wasb.impl.disable.cache=true
spark.hadoop.fs.wasbs.impl=com.databricks.common.filesystem.LokiFileSystem
spark.hadoop.fs.wasbs.impl.disable.cache=true
spark.hadoop.hive.hmshandler.retry.attempts=10
spark.hadoop.hive.hmshandler.retry.interval=2000
spark.hadoop.hive.server2.enable.doAs=false
spark.hadoop.hive.server2.idle.operation.timeout=7200000
spark.hadoop.hive.server2.idle.session.timeout=900000
spark.hadoop.hive.server2.keystore.password=*********(redacted)
spark.hadoop.hive.server2.keystore.path=/databricks/keys/jetty-ssl-driver-keystore.jks
spark.hadoop.hive.server2.session.check.interval=60000
spark.hadoop.hive.server2.thrift.http.cookie.auth.enabled=false
spark.hadoop.hive.server2.thrift.http.port=10000
spark.hadoop.hive.server2.transport.mode=http
spark.hadoop.hive.server2.use.SSL=true
spark.hadoop.hive.warehouse.subdir.inherit.perms=false
spark.hadoop.mapred.output.committer.class=com.databricks.backend.daemon.data.client.DirectOutputCommitter
spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version=2
spark.hadoop.parquet.abfs.readahead.optimization.enabled=true
spark.hadoop.parquet.block.size.row.check.max=10
spark.hadoop.parquet.block.size.row.check.min=10
spark.hadoop.parquet.filter.columnindex.enabled=false
spark.hadoop.parquet.memory.pool.ratio=0.5
spark.hadoop.parquet.page.metadata.validation.enabled=true
spark.hadoop.parquet.page.size.check.estimate=false
spark.hadoop.parquet.page.verify-checksum.enabled=true
spark.hadoop.parquet.page.write-checksum.enabled=true
spark.hadoop.spark.databricks.io.parquet.verifyChecksumOnWrite.enabled=false
spark.hadoop.spark.databricks.io.parquet.verifyChecksumOnWrite.throwsException=false
spark.hadoop.spark.databricks.metrics.filesystem_metrics=true
spark.hadoop.spark.driverproxy.customHeadersToProperties=*********(redacted)
spark.hadoop.spark.hadoop.aws.glue.cache.db.size=1000
spark.hadoop.spark.hadoop.aws.glue.cache.db.ttl-mins=30
spark.hadoop.spark.hadoop.aws.glue.cache.table.size=1000
spark.hadoop.spark.hadoop.aws.glue.cache.table.ttl-mins=30
spark.hadoop.spark.sql.parquet.output.committer.class=org.apache.spark.sql.parquet.DirectParquetOutputCommitter
spark.hadoop.spark.sql.sources.outputCommitterClass=com.databricks.backend.daemon.data.client.MapReduceDirectOutputCommitter
spark.home=/databricks/spark
spark.logConf=true
spark.master=local[*, 4]
spark.memory.offHeap.enabled=true
spark.memory.offHeap.size=5728370688
spark.metrics.conf=/databricks/spark/conf/metrics.properties
spark.r.backendConnectionTimeout=604800
spark.r.numRBackendThreads=1
spark.rdd.compress=true
spark.repl.class.outputDir=/local_disk0/tmp/repl/spark-7666683401531837394-987aa9b6-0e13-4dde-8745-8bb756ba9bd3
spark.rpc.message.maxSize=256
spark.scheduler.listenerbus.eventqueue.capacity=20000
spark.scheduler.mode=FAIR
spark.serializer.objectStreamReset=100
spark.shuffle.manager=SORT
spark.shuffle.memoryFraction=0.2
spark.shuffle.reduceLocality.enabled=false
spark.shuffle.service.enabled=true
spark.shuffle.service.port=4048
spark.sparklyr-backend.threads=1
spark.sparkr.use.daemon=false
spark.speculation=false
spark.speculation.multiplier=3
spark.speculation.quantile=0.9
spark.sql.allowMultipleContexts=false
spark.sql.hive.convertCTAS=true
spark.sql.hive.convertMetastoreParquet=true
spark.sql.hive.metastore.jars=/databricks/databricks-hive/*
spark.sql.hive.metastore.sharedPrefixes=org.mariadb.jdbc,com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,microsoft.sql.DateTimeOffset,microsoft.sql.Types,com.databricks,com.codahale,com.fasterxml.jackson,shaded.databricks
spark.sql.hive.metastore.version=0.13.0
spark.sql.legacy.createHiveTableByDefault=false
spark.sql.parquet.cacheMetadata=true
spark.sql.parquet.compression.codec=snappy
spark.sql.sources.commitProtocolClass=com.databricks.sql.transaction.directory.DirectoryAtomicCommitProtocol
spark.sql.sources.default=delta
spark.sql.streaming.checkpointFileManagerClass=com.databricks.spark.sql.streaming.DatabricksCheckpointFileManager
spark.sql.streaming.stopTimeout=15s
spark.sql.warehouse.dir=*********(redacted)
spark.storage.blockManagerTimeoutIntervalMs=300000
spark.storage.memoryFraction=0.5
spark.streaming.driver.writeAheadLog.allowBatching=true
spark.streaming.driver.writeAheadLog.closeFileAfterWrite=true
spark.task.reaper.enabled=true
spark.task.reaper.killTimeout=60s
spark.ui.port=40001
spark.ui.prometheus.enabled=true
spark.worker.aioaLazyConfig.dbfsReadinessCheckClientClass=com.databricks.backend.daemon.driver.NephosDbfsReadinessCheckClient
spark.worker.aioaLazyConfig.iamReadinessCheckClientClass=com.databricks.backend.daemon.driver.NephosIamRoleCheckClient
spark.worker.cleanup.enabled=false
  412. 23/10/02 06:47:28 WARN MetricsSystem: Using default name SparkStatusTracker for source because neither spark.metrics.namespace nor spark.app.id is set.
  413. 23/10/02 06:47:28 INFO log: Logging initialized @17266ms to org.eclipse.jetty.util.log.Slf4jLog
  414. 23/10/02 06:47:28 INFO JettyUtils: Start Jetty 10.139.64.4:40001 for SparkUI
  415. 23/10/02 06:47:28 INFO Server: jetty-9.4.50.v20221201; built: 2022-12-01T22:07:03.915Z; git: da9a0b30691a45daf90a9f17b5defa2f1434f882; jvm 1.8.0_372-b07
  416. 23/10/02 06:47:28 INFO Server: Started @17488ms
  417. 23/10/02 06:47:29 INFO AbstractConnector: Started ServerConnector@29b61bb{HTTP/1.1, (http/1.1)}{10.139.64.4:40001}
  418. 23/10/02 06:47:29 INFO Utils: Successfully started service 'SparkUI' on port 40001.
  419. 23/10/02 06:47:29 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@64b242b3{/,null,AVAILABLE,@Spark}
  420. 23/10/02 06:47:30 INFO DriverPluginContainer: Initialized driver component for plugin org.apache.spark.sql.connect.SparkConnectPlugin.
  421. 23/10/02 06:47:30 INFO DLTDebugger: Registered DLTDebuggerEndpoint at endpoint dlt-debugger
  422. 23/10/02 06:47:31 INFO DriverPluginContainer: Initialized driver component for plugin org.apache.spark.debugger.DLTDebuggerSparkPlugin.
  423. 23/10/02 06:47:31 INFO FairSchedulableBuilder: Fair scheduler configuration not found, created default pool: default, schedulingMode: FAIR, minShare: 0, weight: 1
  424. 23/10/02 06:47:31 INFO Executor: Starting executor ID driver on host 10.139.64.4
  425. 23/10/02 06:47:31 ERROR FuseLogAggregator: 2023/10/02 06:46:56.990553 dbr.ERROR Unable to setup backend: Volumes: &{/Volumes uc-volumes:/Volumes {CseMetaFetching: S3FsImpl: AccessKeyId: SecretAccessKey: SessionTokenId: CannedACL:{cannedAclHeader:} RequesterPays: CredentialsType: CredentialsProvider: StsAssumeRoleArn: StsAssumeRoleExternalId: StsEndpoint: AutoDetectEndpoint: Endpoint: FallbackEndpoint: SseAlgorithm: SseKey: SseCAlgorithm: Cse: CseMaterialProvider: OldAccessKeyId: OldSecretAccessKey: OldStsAssumeRoleArn: OldStsAssumeRoleExternalId: OldStsEndpoint: OldSseKey: OldSseKmsKeyId: OldSseCKey: OldCredentialsCustomClass: AdlsOauth2ClientId: AdlsOauth2RefreshUrl: AdlsOauth2Credential: AdlsOauth2TokenProvider: AdlsOauth2TokenProviderType: AdlsOauth2TokenCustomProvider: AdlsOauth2RefreshToken: AdlsFsImpl: AbfsAuthType: AbfsOauthProvider: AbfsOauth2ClientId: AbfsOauth2ClientSecret: AbfsOauth2ClientEndpoint: AbfsTokenProviderType: AbfsSasToken: GcpCredentialsType: Attributes:map[]}}: Unrecognized storage scheme: uc-volumes
  426. 23/10/02 06:47:31 ERROR FuseLogAggregator: 2023/10/02 06:46:56.990771 dbr.ERROR Unable to setup backend: Volume: &{/Volume dbfs-reserved-path:/uc-volumes-reserved {CseMetaFetching: S3FsImpl: AccessKeyId: SecretAccessKey: SessionTokenId: CannedACL:{cannedAclHeader:} RequesterPays: CredentialsType: CredentialsProvider: StsAssumeRoleArn: StsAssumeRoleExternalId: StsEndpoint: AutoDetectEndpoint: Endpoint: FallbackEndpoint: SseAlgorithm: SseKey: SseCAlgorithm: Cse: CseMaterialProvider: OldAccessKeyId: OldSecretAccessKey: OldStsAssumeRoleArn: OldStsAssumeRoleExternalId: OldStsEndpoint: OldSseKey: OldSseKmsKeyId: OldSseCKey: OldCredentialsCustomClass: AdlsOauth2ClientId: AdlsOauth2RefreshUrl: AdlsOauth2Credential: AdlsOauth2TokenProvider: AdlsOauth2TokenProviderType: AdlsOauth2TokenCustomProvider: AdlsOauth2RefreshToken: AdlsFsImpl: AbfsAuthType: AbfsOauthProvider: AbfsOauth2ClientId: AbfsOauth2ClientSecret: AbfsOauth2ClientEndpoint: AbfsTokenProviderType: AbfsSasToken: GcpCredentialsType: Attributes:map[]}}: Unrecognized storage scheme: dbfs-reserved-path
  427. 23/10/02 06:47:31 ERROR FuseLogAggregator: 2023/10/02 06:46:56.990839 dbr.ERROR Unable to setup backend: volumes: &{/volumes dbfs-reserved-path:/uc-volumes-reserved {CseMetaFetching: S3FsImpl: AccessKeyId: SecretAccessKey: SessionTokenId: CannedACL:{cannedAclHeader:} RequesterPays: CredentialsType: CredentialsProvider: StsAssumeRoleArn: StsAssumeRoleExternalId: StsEndpoint: AutoDetectEndpoint: Endpoint: FallbackEndpoint: SseAlgorithm: SseKey: SseCAlgorithm: Cse: CseMaterialProvider: OldAccessKeyId: OldSecretAccessKey: OldStsAssumeRoleArn: OldStsAssumeRoleExternalId: OldStsEndpoint: OldSseKey: OldSseKmsKeyId: OldSseCKey: OldCredentialsCustomClass: AdlsOauth2ClientId: AdlsOauth2RefreshUrl: AdlsOauth2Credential: AdlsOauth2TokenProvider: AdlsOauth2TokenProviderType: AdlsOauth2TokenCustomProvider: AdlsOauth2RefreshToken: AdlsFsImpl: AbfsAuthType: AbfsOauthProvider: AbfsOauth2ClientId: AbfsOauth2ClientSecret: AbfsOauth2ClientEndpoint: AbfsTokenProviderType: AbfsSasToken: GcpCredentialsType: Attributes:map[]}}: Unrecognized storage scheme: dbfs-reserved-path
  428. 23/10/02 06:47:31 ERROR FuseLogAggregator: 2023/10/02 06:46:56.990899 dbr.ERROR Unable to setup backend: volume: &{/volume dbfs-reserved-path:/uc-volumes-reserved {CseMetaFetching: S3FsImpl: AccessKeyId: SecretAccessKey: SessionTokenId: CannedACL:{cannedAclHeader:} RequesterPays: CredentialsType: CredentialsProvider: StsAssumeRoleArn: StsAssumeRoleExternalId: StsEndpoint: AutoDetectEndpoint: Endpoint: FallbackEndpoint: SseAlgorithm: SseKey: SseCAlgorithm: Cse: CseMaterialProvider: OldAccessKeyId: OldSecretAccessKey: OldStsAssumeRoleArn: OldStsAssumeRoleExternalId: OldStsEndpoint: OldSseKey: OldSseKmsKeyId: OldSseCKey: OldCredentialsCustomClass: AdlsOauth2ClientId: AdlsOauth2RefreshUrl: AdlsOauth2Credential: AdlsOauth2TokenProvider: AdlsOauth2TokenProviderType: AdlsOauth2TokenCustomProvider: AdlsOauth2RefreshToken: AdlsFsImpl: AbfsAuthType: AbfsOauthProvider: AbfsOauth2ClientId: AbfsOauth2ClientSecret: AbfsOauth2ClientEndpoint: AbfsTokenProviderType: AbfsSasToken: GcpCredentialsType: Attributes:map[]}}: Unrecognized storage scheme: dbfs-reserved-path
  429. 23/10/02 06:47:31 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): 'file:/databricks/spark/dbconf/log4j/executor/,file:/databricks/spark/dbconf/jets3t/,file:/databricks/spark/dbconf/hadoop/,file:/databricks/hive/conf/,file:/databricks/jars/*,file:/databricks/driver/conf/,file:/databricks/driver/hadoop,file:/databricks/driver/executor,file:/databricks/driver/*,file:/databricks/driver/jets3t'
  430. 23/10/02 06:47:31 INFO Executor: Using REPL class URI: spark://10.139.64.4:44035/classes
  431. 23/10/02 06:47:31 INFO ExecutorPluginContainer: Initialized executor component for plugin org.apache.spark.debugger.DLTDebuggerSparkPlugin.
  432. 23/10/02 06:47:31 INFO Utils: resolved command to be run: WrappedArray(getconf, PAGESIZE)
  433. 23/10/02 06:47:31 INFO TaskSchedulerImpl: Task preemption enabled.
  434. 23/10/02 06:47:31 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41225.
  435. 23/10/02 06:47:31 INFO NettyBlockTransferService: Server created on 10.139.64.4:41225
  436. 23/10/02 06:47:31 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
  437. 23/10/02 06:47:31 INFO BlockManager: external shuffle service port = 4048
  438. 23/10/02 06:47:31 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.139.64.4, 41225, None)
  439. 23/10/02 06:47:31 INFO BlockManagerMasterEndpoint: Registering block manager 10.139.64.4:41225 with 8.7 GiB RAM, BlockManagerId(driver, 10.139.64.4, 41225, None)
  440. 23/10/02 06:47:31 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.139.64.4, 41225, None)
  441. 23/10/02 06:47:31 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.139.64.4, 41225, None)
  442. 23/10/02 06:47:31 INFO DBCEventLoggingListener: Initializing DBCEventLoggingListener
  443. 23/10/02 06:47:31 INFO DBCEventLoggingListener: Logging events to eventlogs/7666683401531837394/eventlog
  444. 23/10/02 06:47:31 INFO Server: jetty-9.4.50.v20221201; built: 2022-12-01T22:07:03.915Z; git: da9a0b30691a45daf90a9f17b5defa2f1434f882; jvm 1.8.0_372-b07
  445. 23/10/02 06:47:31 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@24b2b5d3{/,null,AVAILABLE}
  446. 23/10/02 06:47:31 INFO SslContextFactory: x509=X509@7a8b7f2e(1,h=[az-australiaeast.workers.prod.ns.databricks.com],a=[],w=[]) for Server@74d7ebce[provider=null,keyStore=file:///databricks/keys/jetty_ssl_driver_keystore.jks,trustStore=file:///databricks/keys/jetty_ssl_driver_keystore.jks]
  447. 23/10/02 06:47:31 INFO AbstractConnector: Started ServerConnector@34e1a87b{SSL, (ssl, http/1.1)}{0.0.0.0:1023}
  448. 23/10/02 06:47:31 INFO Server: Started @20341ms
  449. 23/10/02 06:47:31 INFO FuseDaemonServer: FuseDaemonServer started on 1023 with endpoint: '/get-unity-token'.
  450. 23/10/02 06:47:31 INFO SparkContext: Registered listener com.databricks.backend.daemon.driver.DBCEventLoggingListener
  451. 23/10/02 06:47:32 INFO DatabricksILoop$: Finished creating throwaway interpreter
  452. 23/10/02 06:47:32 INFO ContextHandler: Stopped o.e.j.s.ServletContextHandler@64b242b3{/,null,STOPPED,@Spark}
  453. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@29617475{/jobs,null,AVAILABLE,@Spark}
  454. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@353b5d86{/jobs/json,null,AVAILABLE,@Spark}
  455. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@43eea3bd{/jobs/job,null,AVAILABLE,@Spark}
  456. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7f7eeaaf{/jobs/job/json,null,AVAILABLE,@Spark}
  457. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@38499139{/stages,null,AVAILABLE,@Spark}
  458. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2ca93dee{/stages/json,null,AVAILABLE,@Spark}
  459. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2265a052{/stages/stage,null,AVAILABLE,@Spark}
  460. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@72090715{/stages/stage/json,null,AVAILABLE,@Spark}
  461. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@fcdeb50{/stages/pool,null,AVAILABLE,@Spark}
  462. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@65f1bf2c{/stages/pool/json,null,AVAILABLE,@Spark}
  463. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@286f8e90{/storage,null,AVAILABLE,@Spark}
  464. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4ea8832c{/storage/json,null,AVAILABLE,@Spark}
  465. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@11a9f958{/storage/rdd,null,AVAILABLE,@Spark}
  466. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7fbe0f64{/storage/rdd/json,null,AVAILABLE,@Spark}
  467. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@53b42a0d{/environment,null,AVAILABLE,@Spark}
  468. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@33c5d3e{/environment/json,null,AVAILABLE,@Spark}
  469. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@376b6d7d{/executors,null,AVAILABLE,@Spark}
  470. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6f272dc4{/executors/json,null,AVAILABLE,@Spark}
  471. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@28817763{/executors/threadDump,null,AVAILABLE,@Spark}
  472. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6abdead7{/executors/threadDump/json,null,AVAILABLE,@Spark}
  473. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6232cde5{/executors/heapHistogram,null,AVAILABLE,@Spark}
  474. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4e9b66ab{/executors/heapHistogram/json,null,AVAILABLE,@Spark}
  475. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@27828021{/static,null,AVAILABLE,@Spark}
  476. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4daa53e0{/,null,AVAILABLE,@Spark}
  477. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@42748f59{/api,null,AVAILABLE,@Spark}
  478. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@86e7c7a{/metrics,null,AVAILABLE,@Spark}
  479. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@22edca96{/jobs/job/kill,null,AVAILABLE,@Spark}
  480. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6e1bd2b{/stages/stage/kill,null,AVAILABLE,@Spark}
  481. 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@13f6395d{/metrics/json,null,AVAILABLE,@Spark}
  482. 23/10/02 06:47:32 INFO SparkContext: Loading Spark Service RPC Server. Classloader stack:List(com.databricks.backend.daemon.driver.ClassLoaders$MultiReplClassLoader@33d08a24, com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader@7906de2f, sun.misc.Launcher$AppClassLoader@3d299e3, sun.misc.Launcher$ExtClassLoader@b672aa8)
  483. 23/10/02 06:47:33 INFO SparkServiceRPCServer: Initializing Spark Service RPC Server. Classloader stack: List(com.databricks.backend.daemon.driver.ClassLoaders$MultiReplClassLoader@33d08a24, com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader@7906de2f, sun.misc.Launcher$AppClassLoader@3d299e3, sun.misc.Launcher$ExtClassLoader@b672aa8)
  484. 23/10/02 06:47:33 INFO SparkServiceRPCServer: Spark Service RPC Server is disabled.
  485. 23/10/02 06:47:33 INFO DatabricksILoop$: Successfully registered spark metrics in Prometheus registry
  486. 23/10/02 06:47:33 INFO DatabricksILoop$: Successfully initialized SparkContext
  487. 23/10/02 06:47:33 INFO SharedState: Scheduler stats enabled.
  488. 23/10/02 06:47:33 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir.
  489. 23/10/02 06:47:33 INFO SharedState: Warehouse path is 'dbfs:/user/hive/warehouse'.
  490. 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@295d1a45{/storage/iocache,null,AVAILABLE,@Spark}
  491. 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4b45016d{/storage/iocache/json,null,AVAILABLE,@Spark}
  492. 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@134ccb03{/SQL,null,AVAILABLE,@Spark}
  493. 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@12e25b4b{/SQL/json,null,AVAILABLE,@Spark}
  494. 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@139e291e{/SQL/execution,null,AVAILABLE,@Spark}
  495. 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2d58136{/SQL/execution/json,null,AVAILABLE,@Spark}
  496. 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6b8de9dc{/static/sql,null,AVAILABLE,@Spark}
  497. 23/10/02 06:47:33 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
  498. 23/10/02 06:47:33 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
  499. 23/10/02 06:47:37 INFO DriverConf: Configured feature flag data source LaunchDarkly
  500. 23/10/02 06:47:37 INFO DriverConf: Configured feature flag data source LaunchDarkly
  501. 23/10/02 06:47:37 WARN DriverConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
  502. 23/10/02 06:47:39 INFO DatabricksMountsStore: Mount store initialization: Attempting to get the list of mounts from metadata manager of DBFS
  503. 23/10/02 06:47:39 INFO log: Logging initialized @28259ms to shaded.v9_4.org.eclipse.jetty.util.log.Slf4jLog
  504. 23/10/02 06:47:39 INFO DynamicRpcConf: Configured feature flag data source LaunchDarkly
  505. 23/10/02 06:47:39 INFO DynamicRpcConf: Configured feature flag data source LaunchDarkly
  506. 23/10/02 06:47:39 WARN DynamicRpcConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
  507. 23/10/02 06:47:40 INFO TypeUtil: JVM Runtime does not support Modules
  508. 23/10/02 06:47:40 INFO DatabricksMountsStore: Mount store initialization: Received a list of 9 mounts accessible from metadata manager of DBFS
  509. 23/10/02 06:47:40 INFO DatabricksMountsStore: Updated mounts cache. Changes: List((+,DbfsMountPoint(s3a://databricks-datasets-sydney/, /databricks-datasets)), (+,DbfsMountPoint(uc-volumes:/Volumes, /Volumes)), (+,DbfsMountPoint(unsupported-access-mechanism-for-path--use-mlflow-client:/, /databricks/mlflow-tracking)), (+,DbfsMountPoint(abfss://dbstorageelrh4j5j7bx2k.dfs.core.windows.net/4805034236521897, /databricks-results)), (+,DbfsMountPoint(unsupported-access-mechanism-for-path--use-mlflow-client:/, /databricks/mlflow-registry)), (+,DbfsMountPoint(dbfs-reserved-path:/uc-volumes-reserved, /Volume)), (+,DbfsMountPoint(dbfs-reserved-path:/uc-volumes-reserved, /volumes)), (+,DbfsMountPoint(abfss://dbstorageelrh4j5j7bx2k.dfs.core.windows.net/4805034236521897, /)), (+,DbfsMountPoint(dbfs-reserved-path:/uc-volumes-reserved, /volume)))
  510. 23/10/02 06:47:41 INFO DatabricksFileSystemV2Factory: Creating abfss file system for abfss://[email protected]
  511. 23/10/02 06:47:42 INFO AzureBlobFileSystem:V3: Initializing AzureBlobFileSystem for abfss://[email protected]/4805034236521897 with credential = FixedSASTokenProvider with jvmId = 491
  512. 23/10/02 06:47:42 INFO DbfsHadoop3: Initialized DBFS with DBFSV2 as the delegate.
  513. 23/10/02 06:47:42 INFO HiveConf: Found configuration file file:/databricks/hive/conf/hive-site.xml
  514. 23/10/02 06:47:42 INFO SessionManager: HiveServer2: Background operation thread pool size: 100
  515. 23/10/02 06:47:42 INFO SessionManager: HiveServer2: Background operation thread wait queue size: 100
  516. 23/10/02 06:47:42 INFO SessionManager: HiveServer2: Background operation thread keepalive time: 10 seconds
  517. 23/10/02 06:47:42 INFO AbstractService: Service:OperationManager is inited.
  518. 23/10/02 06:47:42 INFO AbstractService: Service:SessionManager is inited.
  519. 23/10/02 06:47:42 INFO SparkSQLCLIService: Service: CLIService is inited.
  520. 23/10/02 06:47:42 INFO AbstractService: Service:ThriftHttpCLIService is inited.
  521. 23/10/02 06:47:42 INFO HiveThriftServer2: Service: HiveServer2 is inited.
  522. 23/10/02 06:47:42 INFO AbstractService: Service:OperationManager is started.
  523. 23/10/02 06:47:42 INFO AbstractService: Service:SessionManager is started.
  524. 23/10/02 06:47:42 INFO SparkSQLCLIService: Service: CLIService is started.
  525. 23/10/02 06:47:42 INFO AbstractService: Service:ThriftHttpCLIService is started.
  526. 23/10/02 06:47:42 INFO ThriftCLIService: HTTP Server SSL: adding excluded protocols: [SSLv2, SSLv3]
  527. 23/10/02 06:47:42 INFO ThriftCLIService: HTTP Server SSL: SslContextFactory.getExcludeProtocols = [SSL, SSLv2, SSLv2Hello, SSLv3]
  528. 23/10/02 06:47:42 INFO Server: jetty-9.4.50.v20221201; built: 2022-12-01T22:07:03.915Z; git: da9a0b30691a45daf90a9f17b5defa2f1434f882; jvm 1.8.0_372-b07
  529. 23/10/02 06:47:42 INFO session: DefaultSessionIdManager workerName=node0
  530. 23/10/02 06:47:42 INFO session: No SessionScavenger set, using defaults
  531. 23/10/02 06:47:43 INFO session: node0 Scavenging every 600000ms
  532. 23/10/02 06:47:43 WARN SecurityHandler: [email protected]@2e5f2387{/,null,STARTING} has uncovered http methods for path: /*
  533. 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2e5f2387{/,null,AVAILABLE}
  534. 23/10/02 06:47:43 INFO SslContextFactory: x509=X509@747d4d9a(1,h=[az-australiaeast.workers.prod.ns.databricks.com],a=[],w=[]) for Server@2982f1eb[provider=null,keyStore=file:///databricks/keys/jetty-ssl-driver-keystore.jks,trustStore=null]
  535. 23/10/02 06:47:43 INFO AbstractConnector: Started ServerConnector@50a4e4e{SSL, (ssl, http/1.1)}{0.0.0.0:10000}
  536. 23/10/02 06:47:43 INFO Server: Started @31573ms
  537. 23/10/02 06:47:43 INFO ThriftCLIService: Started ThriftHttpCLIService in https mode on port 10000 path=/cliservice/* with 5...500 worker threads
  538. 23/10/02 06:47:43 INFO AbstractService: Service:HiveServer2 is started.
  539. 23/10/02 06:47:43 INFO HiveThriftServer2: HiveThriftServer2 started
  540. 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2324100b{/sqlserver,null,AVAILABLE,@Spark}
  541. 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@32c77e0d{/sqlserver/json,null,AVAILABLE,@Spark}
  542. 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@399b9537{/sqlserver/session,null,AVAILABLE,@Spark}
  543. 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7d4f7aab{/sqlserver/session/json,null,AVAILABLE,@Spark}
  544. 23/10/02 06:47:43 INFO LibraryResolutionManager: Preferred maven central mirror is configured to https://maven-central.storage-download.googleapis.com/maven2/
  545. 23/10/02 06:47:43 WARN OutgoingDirectNotebookBufferRateLimiter$: No value specified for db-outgoing-buffer-throttler-burst. Using default: 100000000000
  546. 23/10/02 06:47:43 WARN OutgoingDirectNotebookBufferRateLimiter$: No value specified for db-outgoing-buffer-throttler-steady-rate. Using default: 6000000000
  547. 23/10/02 06:47:43 WARN OutgoingDirectNotebookBufferRateLimiter$: No value specified for db-outgoing-buffer-throttler-warning-interval-sec. Using default: 60
  548. 23/10/02 06:47:43 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
  549. 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7c9745ae{/StreamingQuery,null,AVAILABLE,@Spark}
  550. 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5b481d77{/StreamingQuery/json,null,AVAILABLE,@Spark}
  551. 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@42acca1a{/StreamingQuery/statistics,null,AVAILABLE,@Spark}
  552. 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4a26a54b{/StreamingQuery/statistics/json,null,AVAILABLE,@Spark}
  553. 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7d151d89{/static/sql,null,AVAILABLE,@Spark}
  554. 23/10/02 06:47:43 INFO JettyServer$: Creating thread pool with name ...
  555. 23/10/02 06:47:43 INFO JettyServer$: Thread pool created
  556. 23/10/02 06:47:43 INFO JettyServer$: Creating thread pool with name ...
  557. 23/10/02 06:47:43 INFO JettyServer$: Thread pool created
  558. 23/10/02 06:47:43 INFO DriverDaemon: Starting driver daemon...
  559. 23/10/02 06:47:43 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.password, old value Some(), new value [REDACTED], new value will take effect.
  560. 23/10/02 06:47:43 WARN SparkConfUtils$: Setting the same key twice for spark.databricks.io.directoryCommit.enableLogicalDelete, old value Some(false), new value false, new value will take effect.
  561. 23/10/02 06:47:43 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.path, old value Some(/databricks/keys/jetty_ssl_driver_keystore.jks), new value /databricks/keys/jetty-ssl-driver-keystore.jks, new value will take effect.
  562. 23/10/02 06:47:43 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
  563. 23/10/02 06:47:43 INFO DriverDaemon$: Attempting to run: 'set up ttyd daemon'
  564. 23/10/02 06:47:43 INFO DriverDaemon$: Attempting to run: 'Configuring RStudio daemon'
  565. 23/10/02 06:47:43 INFO DriverDaemon$: Resetting the default python executable
  566. 23/10/02 06:47:43 INFO Utils: resolved command to be run: List(virtualenv, /local_disk0/.ephemeral_nfs/cluster_libraries/python, -p, /databricks/python/bin/python, --no-download, --no-setuptools, --no-wheel)
  567. 23/10/02 06:47:44 INFO PythonEnvCloneHelper$: Created python virtualenv: /local_disk0/.ephemeral_nfs/cluster_libraries/python
  568. 23/10/02 06:47:44 INFO Utils: resolved command to be run: List(/databricks/python/bin/python, -c, import sys; dirs=[p for p in sys.path if 'package' in p]; print(' '.join(dirs)))
  569. 23/10/02 06:47:45 INFO Utils: resolved command to be run: List(/local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/python, -c, from sysconfig import get_path; print(get_path('purelib')))
  570. 23/10/02 06:47:45 INFO PythonEnvCloneHelper$: Created sites.pth at /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.10/site-packages/sites.pth
  571. 23/10/02 06:47:45 INFO ClusterWidePythonEnvManager: Registered /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.10/site-packages with the WatchService sun.nio.fs.LinuxWatchService$LinuxWatchKey@77d68b94
  572. 23/10/02 06:47:45 INFO DriverDaemon$: Attempting to run: 'Update root virtualenv'
  573. 23/10/02 06:47:45 INFO DriverDaemon$: Finished updating /etc/environment
  574. 23/10/02 06:47:45 INFO DriverDaemon$$anon$1: Message out thread ready
  575. 23/10/02 06:47:45 INFO NetstatUtil$: Running netstat -lnpt
  576. 23/10/02 06:47:45 INFO NetstatUtil$: netstat -lnpt
  577. Active Internet connections (only servers)
  578. Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name
  579. tcp        0      0 0.0.0.0:7681            0.0.0.0:*               LISTEN      678/ttyd
  580. tcp        0      0 127.0.0.53:53           0.0.0.0:*               LISTEN      55/systemd-resolved
  581. tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN      71/sshd: /usr/sbin/
  582. tcp6       0      0 10.139.64.4:44035       :::*                    LISTEN      491/java
  583. tcp6       0      0 :::6060                 :::*                    LISTEN      339/java
  584. tcp6       0      0 :::15002                :::*                    LISTEN      491/java
  585. tcp6       0      0 :::7071                 :::*                    LISTEN      339/java
  586. tcp6       0      0 10.139.64.4:41225       :::*                    LISTEN      491/java
  587. tcp6       0      0 10.139.64.4:40001       :::*                    LISTEN      491/java
  588. tcp6       0      0 :::1017                 :::*                    LISTEN      155/wsfs
  589. tcp6       0      0 :::1021                 :::*                    LISTEN      155/wsfs
  590. tcp6       0      0 :::1023                 :::*                    LISTEN      491/java
  591. tcp6       0      0 :::1015                 :::*                    LISTEN      176/goofys-dbr
  592. tcp6       0      0 :::22                   :::*                    LISTEN      71/sshd: /usr/sbin/
  593. tcp6       0      0 :::10000                :::*                    LISTEN      491/java
  594.  
  595. 23/10/02 06:47:45 INFO Server: jetty-9.4.50.v20221201; built: 2022-12-01T22:07:03.915Z; git: da9a0b30691a45daf90a9f17b5defa2f1434f882; jvm 1.8.0_372-b07
  596. 23/10/02 06:47:45 INFO AbstractConnector: Started ServerConnector@48c2391{HTTP/1.1, (http/1.1)}{0.0.0.0:6061}
  597. 23/10/02 06:47:45 INFO Server: Started @34088ms
  598. 23/10/02 06:47:45 INFO Server: jetty-9.4.50.v20221201; built: 2022-12-01T22:07:03.915Z; git: da9a0b30691a45daf90a9f17b5defa2f1434f882; jvm 1.8.0_372-b07
  599. 23/10/02 06:47:45 INFO SslContextFactory: x509=X509@688d3e2a(1,h=[az-australiaeast.workers.prod.ns.databricks.com],a=[],w=[]) for Server@4fc2e703[provider=null,keyStore=null,trustStore=null]
  600. 23/10/02 06:47:45 WARN config: Weak cipher suite TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA enabled for Server@4fc2e703[provider=null,keyStore=null,trustStore=null]
  601. 23/10/02 06:47:45 WARN config: Weak cipher suite TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA enabled for Server@4fc2e703[provider=null,keyStore=null,trustStore=null]
  602. 23/10/02 06:47:45 INFO AbstractConnector: Started ServerConnector@2cb34e1e{SSL, (ssl, http/1.1)}{0.0.0.0:6062}
  603. 23/10/02 06:47:45 INFO Server: Started @34181ms
  604. 23/10/02 06:47:45 INFO DriverDaemon: Started comm channel server
  605. 23/10/02 06:47:45 INFO DriverDaemon: Driver daemon started.
  606. 23/10/02 06:47:45 INFO DynamicInfoServiceConf: Configured feature flag data source LaunchDarkly
  607. 23/10/02 06:47:45 INFO DynamicInfoServiceConf: Configured feature flag data source LaunchDarkly
  608. 23/10/02 06:47:45 WARN DynamicInfoServiceConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
  609. 23/10/02 06:47:47 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
  610. 23/10/02 06:47:47 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
  611. 23/10/02 06:47:47 INFO DriverCorral: Loading the root classloader
  612. 23/10/02 06:47:47 INFO DriverCorral: Starting sql repl ReplId-a20c4-6dc3c-e17de
  613. 23/10/02 06:47:47 INFO DriverCorral: Starting sql repl ReplId-61ca1-d7420-ce2cf-b
  614. 23/10/02 06:47:47 INFO DriverCorral: Starting sql repl ReplId-38b1a-f400f-5b2ff-3
  615. 23/10/02 06:47:47 INFO DriverCorral: Starting sql repl ReplId-7ec5d-8a797-40808-0
  616. 23/10/02 06:47:47 INFO DriverCorral: Starting sql repl ReplId-1f405-90a42-fa0c2-4
  617. 23/10/02 06:47:47 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
  618. 23/10/02 06:47:47 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
  619. 23/10/02 06:47:47 INFO SQLDriverWrapper: setupRepl:ReplId-38b1a-f400f-5b2ff-3: finished to load
  620. 23/10/02 06:47:47 INFO SQLDriverWrapper: setupRepl:ReplId-7ec5d-8a797-40808-0: finished to load
  621. 23/10/02 06:47:47 INFO SQLDriverWrapper: setupRepl:ReplId-1f405-90a42-fa0c2-4: finished to load
  622. 23/10/02 06:47:47 INFO SQLDriverWrapper: setupRepl:ReplId-61ca1-d7420-ce2cf-b: finished to load
  623. 23/10/02 06:47:47 INFO SQLDriverWrapper: setupRepl:ReplId-a20c4-6dc3c-e17de: finished to load
  624. 23/10/02 06:47:47 INFO DriverCorral: Starting r repl ReplId-62225-c3991-b74c2-9
  625. 23/10/02 06:47:47 INFO ROutputStreamHandler: Connection succeeded on port 34891
  626. 23/10/02 06:47:47 INFO ROutputStreamHandler: Connection succeeded on port 33679
  627. 23/10/02 06:47:47 INFO RDriverLocal: 1. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: object created for ReplId-62225-c3991-b74c2-9.
  628. 23/10/02 06:47:47 INFO RDriverLocal: 2. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: initializing ...
  629. 23/10/02 06:47:47 INFO RDriverLocal: 3. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: started RBackend thread on port 36019
  630. 23/10/02 06:47:47 INFO RDriverLocal: 4. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: waiting for SparkR to be installed ...
  631. 23/10/02 06:47:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  632. 23/10/02 06:47:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  633. 23/10/02 06:47:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  634. 23/10/02 06:47:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  635. 23/10/02 06:48:00 INFO RDriverLocal$: SparkR installation completed.
  636. 23/10/02 06:48:00 INFO RDriverLocal: 5. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: launching R process ...
  637. 23/10/02 06:48:00 INFO RDriverLocal: 6. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: cgroup isolation disabled, not placing R process in REPL cgroup.
  638. 23/10/02 06:48:00 INFO RDriverLocal: 7. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: starting R process on port 1100 (attempt 1) ...
  639. 23/10/02 06:48:00 INFO RDriverLocal$: Debugging command for R process builder: SIMBASPARKINI=/etc/simba.sparkodbc.ini R_LIBS=/local_disk0/.ephemeral_nfs/envs/rEnv-f23941ad-7dc9-49bd-86b0-f366cff935f3:/databricks/spark/R/lib:/local_disk0/.ephemeral_nfs/cluster_libraries/r LD_LIBRARY_PATH=/opt/simba/sparkodbc/lib/64/ SPARKR_BACKEND_CONNECTION_TIMEOUT=604800 DB_STREAM_BEACON_STRING_START=DATABRICKS_STREAM_START-ReplId-62225-c3991-b74c2-9 DB_STDOUT_STREAM_PORT=34891 SPARKR_BACKEND_AUTH_SECRET=a2319d90d7b0c12aab2a4bd2d5753351f0c109dbdb5a35c39c2c236115dda062 DB_STREAM_BEACON_STRING_END=DATABRICKS_STREAM_END-ReplId-62225-c3991-b74c2-9 EXISTING_SPARKR_BACKEND_PORT=36019 ODBCINI=/etc/odbc.ini DB_STDERR_STREAM_PORT=33679 /bin/bash /local_disk0/tmp/_startR.sh1893372858473036520resource.r /local_disk0/tmp/_rServeScript.r7342418338649097687resource.r 1100 None
  640. 23/10/02 06:48:00 INFO RDriverLocal: 8. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: setting up BufferedStreamThread with bufferSize: 1000.
  641. 23/10/02 06:48:02 INFO RDriverLocal: 9. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: R process started with RServe listening on port 1100.
  642. 23/10/02 06:48:02 INFO RDriverLocal: 10. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: starting interpreter to talk to R process ...
  643. 23/10/02 06:48:03 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
  644. 23/10/02 06:48:03 INFO ROutputStreamHandler: Successfully connected to stdout in the RShell.
  645. 23/10/02 06:48:03 INFO ROutputStreamHandler: Successfully connected to stderr in the RShell.
  646. 23/10/02 06:48:03 INFO RDriverLocal: 11. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: R interpreter is connected.
  647. 23/10/02 06:48:03 INFO RDriverWrapper: setupRepl:ReplId-62225-c3991-b74c2-9: finished to load
  648. 23/10/02 06:48:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  649. 23/10/02 06:48:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  650. 23/10/02 06:48:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  651. 23/10/02 06:48:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  652. 23/10/02 06:48:25 INFO EventLoggingStats: UsageLog|1696229220000|1696229250000|2104|3|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
  653. 23/10/02 06:48:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  654. 23/10/02 06:48:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  655. 23/10/02 06:48:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  656. 23/10/02 06:48:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  657. 23/10/02 06:48:34 WARN DriverDaemon: ShouldUseAutoscalingInfo exception thrown, not logging stack trace. This is used for control flow and is ok to ignore
  658. 23/10/02 06:48:43 ERROR CommandLineHelper$: Command [REDACTED] failed with exit code 1 out: err:
  659. java.lang.RuntimeException: CommandLineHelper exception - stack trace
  660. at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:523)
  661. at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:418)
  662. at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$6(DriverCorral.scala:389)
  663. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  664. at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)
  665. at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:666)
  666. at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:684)
  667. at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
  668. at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
  669. at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
  670. at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
  671. at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
  672. at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
  673. at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
  674. at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
  675. at com.databricks.threading.NamedTimer$$anon$1.withAttributionTags(NamedTimer.scala:95)
  676. at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:661)
  677. at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:580)
  678. at com.databricks.threading.NamedTimer$$anon$1.recordOperationWithResultTags(NamedTimer.scala:95)
  679. at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:571)
  680. at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:540)
  681. at com.databricks.threading.NamedTimer$$anon$1.recordOperation(NamedTimer.scala:95)
  682. at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$2(NamedTimer.scala:104)
  683. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  684. at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:420)
  685. at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
  686. at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
  687. at com.databricks.logging.UsageLogging.disableTracing(UsageLogging.scala:1420)
  688. at com.databricks.logging.UsageLogging.disableTracing$(UsageLogging.scala:1419)
  689. at com.databricks.threading.NamedTimer$$anon$1.disableTracing(NamedTimer.scala:95)
  690. at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$1(NamedTimer.scala:103)
  691. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  692. at com.databricks.util.UntrustedUtils$.tryLog(UntrustedUtils.scala:109)
  693. at com.databricks.threading.NamedTimer$$anon$1.run(NamedTimer.scala:102)
  694. at java.util.TimerThread.mainLoop(Timer.java:555)
  695. at java.util.TimerThread.run(Timer.java:505)
  696. 23/10/02 06:48:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  697. 23/10/02 06:48:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  698. 23/10/02 06:48:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  699. 23/10/02 06:48:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  700. 23/10/02 06:48:52 WARN DriverDaemon: ShouldUseAutoscalingInfo exception thrown, not logging stack trace. This is used for control flow and is ok to ignore
  701. 23/10/02 06:48:55 INFO EventLoggingStats: UsageLog|1696229250000|1696229280000|184882|24|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
  702. 23/10/02 06:49:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  703. 23/10/02 06:49:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  704. 23/10/02 06:49:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  705. 23/10/02 06:49:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  706. 23/10/02 06:49:25 INFO EventLoggingStats: UsageLog|1696229280000|1696229310000|64928|15|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
  707. 23/10/02 06:49:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  708. 23/10/02 06:49:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  709. 23/10/02 06:49:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  710. 23/10/02 06:49:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  711. 23/10/02 06:49:43 ERROR CommandLineHelper$: Command [REDACTED] failed with exit code 1 out: err:
  712. java.lang.RuntimeException: CommandLineHelper exception - stack trace
  713. at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:523)
  714. at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:418)
  715. at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$6(DriverCorral.scala:389)
  716. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  717. at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)
  718. at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:666)
  719. at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:684)
  720. at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
  721. at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
  722. at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
  723. at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
  724. at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
  725. at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
  726. at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
  727. at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
  728. at com.databricks.threading.NamedTimer$$anon$1.withAttributionTags(NamedTimer.scala:95)
  729. at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:661)
  730. at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:580)
  731. at com.databricks.threading.NamedTimer$$anon$1.recordOperationWithResultTags(NamedTimer.scala:95)
  732. at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:571)
  733. at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:540)
  734. at com.databricks.threading.NamedTimer$$anon$1.recordOperation(NamedTimer.scala:95)
  735. at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$2(NamedTimer.scala:104)
  736. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  737. at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:420)
  738. at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
  739. at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
  740. at com.databricks.logging.UsageLogging.disableTracing(UsageLogging.scala:1420)
  741. at com.databricks.logging.UsageLogging.disableTracing$(UsageLogging.scala:1419)
  742. at com.databricks.threading.NamedTimer$$anon$1.disableTracing(NamedTimer.scala:95)
  743. at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$1(NamedTimer.scala:103)
  744. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  745. at com.databricks.util.UntrustedUtils$.tryLog(UntrustedUtils.scala:109)
  746. at com.databricks.threading.NamedTimer$$anon$1.run(NamedTimer.scala:102)
  747. at java.util.TimerThread.mainLoop(Timer.java:555)
  748. at java.util.TimerThread.run(Timer.java:505)
  749. 23/10/02 06:49:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  750. 23/10/02 06:49:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  751. 23/10/02 06:49:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  752. 23/10/02 06:49:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  753. 23/10/02 06:49:55 INFO EventLoggingStats: UsageLog|1696229310000|1696229340000|76526|17|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
  754. 23/10/02 06:50:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  755. 23/10/02 06:50:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  756. 23/10/02 06:50:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  757. 23/10/02 06:50:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  758. 23/10/02 06:50:25 INFO EventLoggingStats: UsageLog|1696229340000|1696229370000|64928|15|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
  759. 23/10/02 06:50:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  760. 23/10/02 06:50:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  761. 23/10/02 06:50:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  762. 23/10/02 06:50:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  763. 23/10/02 06:50:43 ERROR CommandLineHelper$: Command [REDACTED] failed with exit code 1 out: err:
  764. java.lang.RuntimeException: CommandLineHelper exception - stack trace
  765. at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:523)
  766. at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:418)
  767. at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$6(DriverCorral.scala:389)
  768. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  769. at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)
  770. at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:666)
  771. at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:684)
  772. at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
  773. at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
  774. at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
  775. at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
  776. at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
  777. at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
  778. at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
  779. at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
  780. at com.databricks.threading.NamedTimer$$anon$1.withAttributionTags(NamedTimer.scala:95)
  781. at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:661)
  782. at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:580)
  783. at com.databricks.threading.NamedTimer$$anon$1.recordOperationWithResultTags(NamedTimer.scala:95)
  784. at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:571)
  785. at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:540)
  786. at com.databricks.threading.NamedTimer$$anon$1.recordOperation(NamedTimer.scala:95)
  787. at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$2(NamedTimer.scala:104)
  788. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  789. at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:420)
  790. at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
  791. at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
  792. at com.databricks.logging.UsageLogging.disableTracing(UsageLogging.scala:1420)
  793. at com.databricks.logging.UsageLogging.disableTracing$(UsageLogging.scala:1419)
  794. at com.databricks.threading.NamedTimer$$anon$1.disableTracing(NamedTimer.scala:95)
  795. at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$1(NamedTimer.scala:103)
  796. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  797. at com.databricks.util.UntrustedUtils$.tryLog(UntrustedUtils.scala:109)
  798. at com.databricks.threading.NamedTimer$$anon$1.run(NamedTimer.scala:102)
  799. at java.util.TimerThread.mainLoop(Timer.java:555)
  800. at java.util.TimerThread.run(Timer.java:505)
  801. 23/10/02 06:50:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  802. 23/10/02 06:50:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  803. 23/10/02 06:50:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  804. 23/10/02 06:50:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  805. 23/10/02 06:50:55 INFO EventLoggingStats: UsageLog|1696229370000|1696229400000|64928|15|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
  806. 23/10/02 06:51:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  807. 23/10/02 06:51:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  808. 23/10/02 06:51:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  809. 23/10/02 06:51:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  810. 23/10/02 06:51:25 INFO EventLoggingStats: UsageLog|1696229400000|1696229430000|64932|15|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
  811. 23/10/02 06:51:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  812. 23/10/02 06:51:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  813. 23/10/02 06:51:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
  814. 23/10/02 06:51:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
  815. 23/10/02 06:51:43 ERROR CommandLineHelper$: Command [REDACTED] failed with exit code 1 out: err:
  816. java.lang.RuntimeException: CommandLineHelper exception - stack trace
  817. at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:523)
  818. at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:418)
  819. at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$6(DriverCorral.scala:389)
  820. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  821. at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)
  822. at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:666)
  823. at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:684)
  824. at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
  825. at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
  826. at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
  827. at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
  828. at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
  829. at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
  830. at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
  831. at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
  832. at com.databricks.threading.NamedTimer$$anon$1.withAttributionTags(NamedTimer.scala:95)
  833. at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:661)
  834. at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:580)
  835. at com.databricks.threading.NamedTimer$$anon$1.recordOperationWithResultTags(NamedTimer.scala:95)
  836. at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:571)
  837. at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:540)
  838. at com.databricks.threading.NamedTimer$$anon$1.recordOperation(NamedTimer.scala:95)
  839. at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$2(NamedTimer.scala:104)
  840. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  841. at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:420)
  842. at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
  843. at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
  844. at com.databricks.logging.UsageLogging.disableTracing(UsageLogging.scala:1420)
  845. at com.databricks.logging.UsageLogging.disableTracing$(UsageLogging.scala:1419)
  846. at com.databricks.threading.NamedTimer$$anon$1.disableTracing(NamedTimer.scala:95)
  847. at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$1(NamedTimer.scala:103)
  848. at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  849. at com.databricks.util.UntrustedUtils$.tryLog(UntrustedUtils.scala:109)
  850. at com.databricks.threading.NamedTimer$$anon$1.run(NamedTimer.scala:102)
  851. at java.util.TimerThread.mainLoop(Timer.java:555)
  852. at java.util.TimerThread.run(Timer.java:505)