- 23/10/02 06:47:20 INFO DriverDaemon$: Started Log4j2
- 23/10/02 06:47:23 INFO DriverDaemon$: Current JVM Version 1.8.0_372
- 23/10/02 06:47:24 INFO DriverDaemon$: ========== driver starting up ==========
- 23/10/02 06:47:24 INFO DriverDaemon$: Java: Azul Systems, Inc. 1.8.0_372
- 23/10/02 06:47:24 INFO DriverDaemon$: OS: Linux/amd64 5.15.0-1042-azure
- 23/10/02 06:47:24 INFO DriverDaemon$: CWD: /databricks/driver
- 23/10/02 06:47:24 INFO DriverDaemon$: Mem: Max: 6.3G loaded GCs: PS Scavenge, PS MarkSweep
- 23/10/02 06:47:24 INFO DriverDaemon$: Logging multibyte characters: ✓
- 23/10/02 06:47:24 INFO DriverDaemon$: 'publicFile.rolling.rewrite' appender in root logger: class org.apache.logging.log4j.core.appender.rewrite.RewriteAppender
- 23/10/02 06:47:24 INFO DriverDaemon$: == Modules:
- 23/10/02 06:47:24 INFO DynamicLoggingConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:24 INFO DynamicLoggingConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:24 WARN DynamicLoggingConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
- 23/10/02 06:47:24 INFO FeatureFlagRegisterConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:24 INFO FeatureFlagRegisterConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:24 WARN FeatureFlagRegisterConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
- 23/10/02 06:47:25 INFO DriverDaemon$: Starting prometheus metrics log export timer
- 23/10/02 06:47:25 INFO DriverConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:25 INFO DriverConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:25 WARN DriverConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
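
The repeated REGION warnings above come from config classes that key feature-flag lookups by region and fall back to a default when the variable is unset. A minimal sketch of that fallback pattern, with illustrative names only (not Databricks' actual DynamicLoggingConf/DriverConf internals):

    // Illustrative only: a region-keyed lookup that degrades to a default
    // value when the REGION environment variable is not defined, which is
    // what the warnings above describe.
    val region: String = sys.env.getOrElse("REGION", "default")
    def getConfForCurrentRegion(confByRegion: Map[String, String],
                                default: String): String =
      confByRegion.getOrElse(region, default)
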
- 23/10/02 06:47:25 INFO DriverDaemon$: Loaded JDBC drivers in 120 ms
- 23/10/02 06:47:25 INFO DriverDaemon$: Universe Git Hash: 4bfecaa31575b040f75ca7a4a539c6692fdc153c
- 23/10/02 06:47:25 INFO DriverDaemon$: Spark Git Hash: c3998998144bf322f53c9b6c4192ee636b4aa1ed
- 23/10/02 06:47:25 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.password, old value Some(), new value [REDACTED], new value will take effect.
- 23/10/02 06:47:25 WARN SparkConfUtils$: Setting the same key twice for spark.databricks.io.directoryCommit.enableLogicalDelete, old value Some(false), new value false, new value will take effect.
- 23/10/02 06:47:25 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.path, old value Some(/databricks/keys/jetty_ssl_driver_keystore.jks), new value /databricks/keys/jetty-ssl-driver-keystore.jks, new value will take effect.
- 23/10/02 06:47:25 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
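
The three "Setting the same key twice" warnings above are benign: the customization file /tmp/custom-spark.conf (applied on the line just above) sets keys the base configuration already set, and the later value wins. A minimal spark-shell sketch of SparkConf's last-write-wins behaviour:

    import org.apache.spark.SparkConf

    // SparkConf keeps a single value per key, so a second set() on the same
    // key silently replaces the first -- the overriding behaviour the
    // warnings above make explicit.
    val conf = new SparkConf()
      .set("spark.hadoop.hive.server2.keystore.path",
           "/databricks/keys/jetty_ssl_driver_keystore.jks")   // old value
      .set("spark.hadoop.hive.server2.keystore.path",
           "/databricks/keys/jetty-ssl-driver-keystore.jks")   // new value wins
    println(conf.get("spark.hadoop.hive.server2.keystore.path"))
    // -> /databricks/keys/jetty-ssl-driver-keystore.jks
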
- 23/10/02 06:47:25 WARN RunHelpers$: Missing tag isolation client: java.util.NoSuchElementException: key not found: TagDefinition(clientType,The client type for a request, used for isolating resources for the request.,DATA_LABEL_SYSTEM_NOT_SENSITIVE,false,false,List(),UsageLogRedactionConfig(List()))
- 23/10/02 06:47:25 INFO DatabricksILoop$: Creating throwaway interpreter
- 23/10/02 06:47:25 INFO MetastoreMonitor$: Internal metastore configured
- 23/10/02 06:47:25 INFO DataSourceFactory$: DataSource Jdbc URL: jdbc:mariadb://consolidated-australiaeast-prod-metastore-addl-1.mysql.database.azure.com:3306/organization4805034236521897?useSSL=true&sslMode=VERIFY_CA&disableSslHostnameVerification=true&trustServerCertificate=false&serverSslCert=/databricks/common/mysql-ssl-ca-cert.crt
- 23/10/02 06:47:25 INFO NestedConnectionMonitor$$anon$1: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:25 INFO NestedConnectionMonitor$$anon$1: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:25 WARN NestedConnectionMonitor$$anon$1: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
- 23/10/02 06:47:26 INFO DriverCorral: Creating the driver context
- 23/10/02 06:47:26 INFO DatabricksILoop$: Class Server Dir: /local_disk0/tmp/repl/spark-7666683401531837394-987aa9b6-0e13-4dde-8745-8bb756ba9bd3
- 23/10/02 06:47:26 INFO HikariDataSource: metastore-monitor - Starting...
- 23/10/02 06:47:26 INFO HikariDataSource: metastore-monitor - Start completed.
- 23/10/02 06:47:26 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.password, old value Some(), new value [REDACTED], new value will take effect.
- 23/10/02 06:47:26 WARN SparkConfUtils$: Setting the same key twice for spark.databricks.io.directoryCommit.enableLogicalDelete, old value Some(false), new value false, new value will take effect.
- 23/10/02 06:47:26 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.path, old value Some(/databricks/keys/jetty_ssl_driver_keystore.jks), new value /databricks/keys/jetty-ssl-driver-keystore.jks, new value will take effect.
- 23/10/02 06:47:26 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
- 23/10/02 06:47:26 INFO SparkContext: Running Spark version 3.4.0
- 23/10/02 06:47:26 INFO DatabricksEdgeConfigs: serverlessEnabled : false
- 23/10/02 06:47:26 INFO DatabricksEdgeConfigs: perfPackEnabled : true
- 23/10/02 06:47:26 INFO DatabricksEdgeConfigs: classicSqlEnabled : true
- 23/10/02 06:47:26 INFO HikariDataSource: metastore-monitor - Shutdown initiated...
- 23/10/02 06:47:26 INFO HikariDataSource: metastore-monitor - Shutdown completed.
- 23/10/02 06:47:26 INFO MetastoreMonitor: Metastore healthcheck successful (connection duration = 1253 milliseconds)
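
The healthcheck above boils down to opening a JDBC connection against the MariaDB URL logged earlier and timing it. A hypothetical sketch of such a probe, assuming the MariaDB JDBC driver is on the classpath; DB_USER/DB_PASS are placeholders, and the real MetastoreMonitor implementation is not shown in the log:

    import java.sql.DriverManager

    // Hypothetical health probe: connect to the metastore and report how
    // long the connection took, mirroring the "connection duration" line.
    val url = "jdbc:mariadb://consolidated-australiaeast-prod-metastore-addl-1" +
      ".mysql.database.azure.com:3306/organization4805034236521897?useSSL=true"
    val start = System.nanoTime()
    val conn = DriverManager.getConnection(url, sys.env("DB_USER"), sys.env("DB_PASS"))
    try {
      require(conn.isValid(10))   // liveness check with a 10-second timeout
    } finally conn.close()
    println(s"connection duration = ${(System.nanoTime() - start) / 1000000L} milliseconds")
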
- 23/10/02 06:47:27 INFO ResourceUtils: ==============================================================
- 23/10/02 06:47:27 INFO ResourceUtils: No custom resources configured for spark.driver.
- 23/10/02 06:47:27 INFO ResourceUtils: ==============================================================
- 23/10/02 06:47:27 INFO SparkContext: Submitted application: Databricks Shell
- 23/10/02 06:47:27 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1821, script: , vendor: , offHeap -> name: offHeap, amount: 5463, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
- 23/10/02 06:47:27 INFO ResourceProfile: Limiting resource is cpu
- 23/10/02 06:47:27 INFO ResourceProfileManager: Added ResourceProfile id: 0
- 23/10/02 06:47:27 INFO SecurityManager: Changing view acls to: root
- 23/10/02 06:47:27 INFO SecurityManager: Changing modify acls to: root
- 23/10/02 06:47:27 INFO SecurityManager: Changing view acls groups to:
- 23/10/02 06:47:27 INFO SecurityManager: Changing modify acls groups to:
- 23/10/02 06:47:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: root; groups with view permissions: EMPTY; users with modify permissions: root; groups with modify permissions: EMPTY
- 23/10/02 06:47:27 INFO Utils: Successfully started service 'sparkDriver' on port 44035.
- 23/10/02 06:47:27 INFO SparkEnv: Registering MapOutputTracker
- 23/10/02 06:47:27 INFO SparkEnv: Registering BlockManagerMaster
- 23/10/02 06:47:27 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
- 23/10/02 06:47:27 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
- 23/10/02 06:47:27 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
- 23/10/02 06:47:28 INFO DiskBlockManager: Created local directory at /local_disk0/blockmgr-ae7e8010-d904-42e2-8ca1-21d909588411
- 23/10/02 06:47:28 INFO MemoryStore: MemoryStore started with capacity 8.7 GiB
- 23/10/02 06:47:28 INFO SparkEnv: Registering OutputCommitCoordinator
- 23/10/02 06:47:28 INFO DriverConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:28 INFO DriverConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:28 WARN DriverConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
- 23/10/02 06:47:28 INFO SparkContext: Spark configuration:
- eventLog.rolloverIntervalSeconds=900
- libraryDownload.sleepIntervalSeconds=5
- libraryDownload.timeoutSeconds=180
- spark.akka.frameSize=256
- spark.app.name=Databricks Shell
- spark.app.startTime=1696229246446
- spark.cleaner.referenceTracking.blocking=false
- spark.databricks.acl.client=com.databricks.spark.sql.acl.client.SparkSqlAclClient
- spark.databricks.acl.provider=com.databricks.sql.acl.ReflectionBackedAclProvider
- spark.databricks.acl.scim.client=com.databricks.spark.sql.acl.client.DriverToWebappScimClient
- spark.databricks.automl.serviceEnabled=true
- spark.databricks.autotune.maintenance.client.classname=com.databricks.maintenanceautocompute.MACClientImpl
- spark.databricks.cloudProvider=Azure
- spark.databricks.cloudfetch.hasRegionSupport=true
- spark.databricks.cloudfetch.requesterClassName=*********(redacted)
- spark.databricks.cluster.profile=singleNode
- spark.databricks.clusterSource=UI
- spark.databricks.clusterUsageTags.attribute_tag_budget=
- spark.databricks.clusterUsageTags.attribute_tag_dust_execution_env=
- spark.databricks.clusterUsageTags.attribute_tag_dust_maintainer=
- spark.databricks.clusterUsageTags.attribute_tag_dust_suite=
- spark.databricks.clusterUsageTags.attribute_tag_service=
- spark.databricks.clusterUsageTags.autoTerminationMinutes=30
- spark.databricks.clusterUsageTags.azureSubscriptionId=f9fbc78c-f20e-44ba-ad50-a7e15c43a004
- spark.databricks.clusterUsageTags.cloudProvider=Azure
- spark.databricks.clusterUsageTags.clusterAllTags=[{"key":"ResourceClass","value":"SingleNode"},{"key":"Vendor","value":"Databricks"},{"key":"Creator","value":"[email protected]"},{"key":"ClusterName","value":"Dan Corneanu's Cluster"},{"key":"ClusterId","value":"1002-064418-7lm8zo61"},{"key":"DatabricksEnvironment","value":"workerenv-4805034236521897"}]
- spark.databricks.clusterUsageTags.clusterAvailability=ON_DEMAND_AZURE
- spark.databricks.clusterUsageTags.clusterCreator=Webapp
- spark.databricks.clusterUsageTags.clusterFirstOnDemand=1
- spark.databricks.clusterUsageTags.clusterGeneration=0
- spark.databricks.clusterUsageTags.clusterId=1002-064418-7lm8zo61
- spark.databricks.clusterUsageTags.clusterLogDeliveryEnabled=false
- spark.databricks.clusterUsageTags.clusterLogDestination=
- spark.databricks.clusterUsageTags.clusterMetastoreAccessType=RDS_DIRECT
- spark.databricks.clusterUsageTags.clusterName=Dan Corneanu's Cluster
- spark.databricks.clusterUsageTags.clusterNoDriverDaemon=false
- spark.databricks.clusterUsageTags.clusterNodeType=Standard_DS3_v2
- spark.databricks.clusterUsageTags.clusterNumCustomTags=1
- spark.databricks.clusterUsageTags.clusterNumSshKeys=0
- spark.databricks.clusterUsageTags.clusterOwnerOrgId=4805034236521897
- spark.databricks.clusterUsageTags.clusterOwnerUserId=*********(redacted)
- spark.databricks.clusterUsageTags.clusterPinned=false
- spark.databricks.clusterUsageTags.clusterPythonVersion=3
- spark.databricks.clusterUsageTags.clusterResourceClass=SingleNode
- spark.databricks.clusterUsageTags.clusterScalingType=fixed_size
- spark.databricks.clusterUsageTags.clusterSizeType=VM_CONTAINER
- spark.databricks.clusterUsageTags.clusterSku=STANDARD_SKU
- spark.databricks.clusterUsageTags.clusterSpotBidMaxPrice=-1.0
- spark.databricks.clusterUsageTags.clusterState=Pending
- spark.databricks.clusterUsageTags.clusterStateMessage=Starting Spark
- spark.databricks.clusterUsageTags.clusterTargetWorkers=0
- spark.databricks.clusterUsageTags.clusterUnityCatalogMode=*********(redacted)
- spark.databricks.clusterUsageTags.clusterWorkers=0
- spark.databricks.clusterUsageTags.containerType=LXC
- spark.databricks.clusterUsageTags.dataPlaneRegion=australiaeast
- spark.databricks.clusterUsageTags.driverContainerId=71164545dd3c471999e46f4a80a5e47f
- spark.databricks.clusterUsageTags.driverContainerPrivateIp=10.139.64.4
- spark.databricks.clusterUsageTags.driverInstanceId=c5cec963d17d46699e2b99f577177881
- spark.databricks.clusterUsageTags.driverInstancePrivateIp=10.139.0.4
- spark.databricks.clusterUsageTags.driverNodeType=Standard_DS3_v2
- spark.databricks.clusterUsageTags.driverPublicDns=20.28.245.184
- spark.databricks.clusterUsageTags.effectiveSparkVersion=13.2.x-photon-scala2.12
- spark.databricks.clusterUsageTags.enableCredentialPassthrough=*********(redacted)
- spark.databricks.clusterUsageTags.enableDfAcls=false
- spark.databricks.clusterUsageTags.enableElasticDisk=true
- spark.databricks.clusterUsageTags.enableGlueCatalogCredentialPassthrough=*********(redacted)
- spark.databricks.clusterUsageTags.enableJdbcAutoStart=true
- spark.databricks.clusterUsageTags.enableJobsAutostart=true
- spark.databricks.clusterUsageTags.enableLocalDiskEncryption=false
- spark.databricks.clusterUsageTags.enableSqlAclsOnly=false
- spark.databricks.clusterUsageTags.hailEnabled=false
- spark.databricks.clusterUsageTags.ignoreTerminationEventInAlerting=false
- spark.databricks.clusterUsageTags.instanceWorkerEnvId=workerenv-4805034236521897
- spark.databricks.clusterUsageTags.instanceWorkerEnvNetworkType=default
- spark.databricks.clusterUsageTags.isDpCpPrivateLinkEnabled=false
- spark.databricks.clusterUsageTags.isIMv2Enabled=true
- spark.databricks.clusterUsageTags.isServicePrincipalCluster=false
- spark.databricks.clusterUsageTags.isSingleUserCluster=*********(redacted)
- spark.databricks.clusterUsageTags.managedResourceGroup=mrg-databricks-dan
- spark.databricks.clusterUsageTags.ngrokNpipEnabled=false
- spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2=0
- spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Abfss=0
- spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Dbfs=0
- spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2File=0
- spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Gcs=0
- spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2S3=0
- spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Volumes=0
- spark.databricks.clusterUsageTags.numPerClusterInitScriptsV2Workspace=0
- spark.databricks.clusterUsageTags.numPerGlobalInitScriptsV2=0
- spark.databricks.clusterUsageTags.orgId=4805034236521897
- spark.databricks.clusterUsageTags.privateLinkEnabled=false
- spark.databricks.clusterUsageTags.region=australiaeast
- spark.databricks.clusterUsageTags.runtimeEngine=PHOTON
- spark.databricks.clusterUsageTags.sparkEnvVarContainsBacktick=false
- spark.databricks.clusterUsageTags.sparkEnvVarContainsDollarSign=false
- spark.databricks.clusterUsageTags.sparkEnvVarContainsDoubleQuotes=false
- spark.databricks.clusterUsageTags.sparkEnvVarContainsEscape=false
- spark.databricks.clusterUsageTags.sparkEnvVarContainsNewline=false
- spark.databricks.clusterUsageTags.sparkEnvVarContainsSingleQuotes=false
- spark.databricks.clusterUsageTags.sparkImageLabel=release__13.2.x-snapshot-photon-scala2.12__databricks-universe__13.2.5__4bfecaa__c399899__jenkins__4f6bedc__format-3
- spark.databricks.clusterUsageTags.sparkMasterUrlType=*********(redacted)
- spark.databricks.clusterUsageTags.sparkVersion=13.2.x-photon-scala2.12
- spark.databricks.clusterUsageTags.userId=*********(redacted)
- spark.databricks.clusterUsageTags.userProvidedRemoteVolumeCount=*********(redacted)
- spark.databricks.clusterUsageTags.userProvidedRemoteVolumeSizeGb=*********(redacted)
- spark.databricks.clusterUsageTags.userProvidedRemoteVolumeType=*********(redacted)
- spark.databricks.clusterUsageTags.userProvidedSparkVersion=*********(redacted)
- spark.databricks.clusterUsageTags.workerEnvironmentId=workerenv-4805034236521897
- spark.databricks.credential.aws.secretKey.redactor=*********(redacted)
- spark.databricks.credential.redactor=*********(redacted)
- spark.databricks.credential.scope.fs.adls.gen2.tokenProviderClassName=*********(redacted)
- spark.databricks.credential.scope.fs.gs.auth.access.tokenProviderClassName=*********(redacted)
- spark.databricks.credential.scope.fs.impl=*********(redacted)
- spark.databricks.credential.scope.fs.s3a.tokenProviderClassName=*********(redacted)
- spark.databricks.delta.logStore.crossCloud.fatal=true
- spark.databricks.delta.multiClusterWrites.enabled=true
- spark.databricks.delta.preview.enabled=true
- spark.databricks.driverNfs.clusterWidePythonLibsEnabled=true
- spark.databricks.driverNfs.enabled=true
- spark.databricks.driverNfs.pathSuffix=.ephemeral_nfs
- spark.databricks.driverNodeTypeId=Standard_DS3_v2
- spark.databricks.enablePublicDbfsFuse=false
- spark.databricks.eventLog.dir=eventlogs
- spark.databricks.eventLog.enabled=true
- spark.databricks.eventLog.listenerClassName=com.databricks.backend.daemon.driver.DBCEventLoggingListener
- spark.databricks.io.directoryCommit.enableLogicalDelete=false
- spark.databricks.managedCatalog.clientClassName=com.databricks.managedcatalog.ManagedCatalogClientImpl
- spark.databricks.metrics.filesystem_io_metrics=true
- spark.databricks.mlflow.autologging.enabled=true
- spark.databricks.overrideDefaultCommitProtocol=org.apache.spark.sql.execution.datasources.SQLHadoopMapReduceCommitProtocol
- spark.databricks.passthrough.adls.gen2.tokenProviderClassName=*********(redacted)
- spark.databricks.passthrough.adls.tokenProviderClassName=*********(redacted)
- spark.databricks.passthrough.enabled=true
- spark.databricks.passthrough.glue.credentialsProviderFactoryClassName=*********(redacted)
- spark.databricks.passthrough.glue.executorServiceFactoryClassName=*********(redacted)
- spark.databricks.passthrough.oauth.refresher.impl=*********(redacted)
- spark.databricks.passthrough.s3a.threadPoolExecutor.factory.class=com.databricks.backend.daemon.driver.aws.S3APassthroughThreadPoolExecutorFactory
- spark.databricks.passthrough.s3a.tokenProviderClassName=*********(redacted)
- spark.databricks.preemption.enabled=true
- spark.databricks.privateLinkEnabled=false
- spark.databricks.redactor=com.databricks.spark.util.DatabricksSparkLogRedactorProxy
- spark.databricks.repl.enableClassFileCleanup=true
- spark.databricks.secret.envVar.keys.toRedact=*********(redacted)
- spark.databricks.secret.sparkConf.keys.toRedact=*********(redacted)
- spark.databricks.service.dbutils.repl.backend=com.databricks.dbconnect.ReplDBUtils
- spark.databricks.service.dbutils.server.backend=com.databricks.dbconnect.SparkServerDBUtils
- spark.databricks.session.share=false
- spark.databricks.sparkContextId=7666683401531837394
- spark.databricks.sql.configMapperClass=com.databricks.dbsql.config.SqlConfigMapperBridge
- spark.databricks.tahoe.logStore.aws.class=com.databricks.tahoe.store.MultiClusterLogStore
- spark.databricks.tahoe.logStore.azure.class=com.databricks.tahoe.store.AzureLogStore
- spark.databricks.tahoe.logStore.class=com.databricks.tahoe.store.DelegatingLogStore
- spark.databricks.tahoe.logStore.gcp.class=com.databricks.tahoe.store.GCPLogStore
- spark.databricks.unityCatalog.credentialManager.apiTokenProviderClassName=*********(redacted)
- spark.databricks.unityCatalog.credentialManager.tokenRefreshEnabled=*********(redacted)
- spark.databricks.unityCatalog.volumes.fuse.server.enabled=true
- spark.databricks.workerNodeTypeId=Standard_DS3_v2
- spark.databricks.workspaceUrl=*********(redacted)
- spark.databricks.wsfs.workspacePrivatePreview=true
- spark.databricks.wsfsPublicPreview=true
- spark.delta.sharing.profile.provider.class=*********(redacted)
- spark.driver.allowMultipleContexts=false
- spark.driver.extraJavaOptions=-Djava.net.preferIPv6Addresses=false -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED --add-opens=java.management/sun.management=ALL-UNNAMED -Djdk.reflect.useDirectMethodHandle=false
- spark.driver.host=10.139.64.4
- spark.driver.maxResultSize=4g
- spark.driver.port=44035
- spark.driver.tempDirectory=/local_disk0/tmp
- spark.eventLog.enabled=false
- spark.executor.extraClassPath=/databricks/spark/dbconf/log4j/executor:/databricks/spark/dbconf/jets3t/:/databricks/spark/dbconf/hadoop:/databricks/hive/conf:/databricks/jars/*
- spark.executor.extraJavaOptions=-Djava.net.preferIPv6Addresses=false -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED --add-opens=java.management/sun.management=ALL-UNNAMED -Djdk.reflect.useDirectMethodHandle=false -Djava.io.tmpdir=/local_disk0/tmp -XX:ReservedCodeCacheSize=512m -XX:+UseCodeCacheFlushing -XX:PerMethodRecompilationCutoff=-1 -XX:PerBytecodeRecompilationCutoff=-1 -Djava.security.properties=/databricks/spark/dbconf/java/extra.security -XX:-UseContainerSupport -XX:+PrintFlagsFinal -XX:+PrintGCDateStamps -XX:+PrintGCDetails -verbose:gc -Xss4m -Djava.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni -Djavax.xml.datatype.DatatypeFactory=com.sun.org.apache.xerces.internal.jaxp.datatype.DatatypeFactoryImpl -Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl -Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl -Djavax.xml.validation.SchemaFactory:http://www.w3.org/2001/XMLSchema=com.sun.org.apache.xerces.internal.jaxp.validation.XMLSchemaFactory -Dorg.xml.sax.driver=com.sun.org.apache.xerces.internal.parsers.SAXParser -Dorg.w3c.dom.DOMImplementationSourceList=com.sun.org.apache.xerces.internal.dom.DOMXSImplementationSourceImpl -Djavax.net.ssl.sessionCacheSize=10000 -Dscala.reflect.runtime.disable.typetag.cache=true -Dcom.google.cloud.spark.bigquery.repackaged.io.netty.tryReflectionSetAccessible=true -Dlog4j2.formatMsgNoLookups=true -verbose:gc -Xloggc:/dev/stdout -verbose:class -XX:+UnlockDiagnosticVMOptions -XX:+LogVMOutput -XX:-DisplayVMOutput -XX:LogFile=/databricks/databricks_vm_pipe -Ddatabricks.vmLog.pipe=/databricks/databricks_vm_pipe -Ddatabricks.serviceName=spark-executor-1
- spark.executor.id=driver
- spark.executor.memory=1821m
- spark.executor.tempDirectory=/local_disk0/tmp
- spark.extraListeners=com.databricks.backend.daemon.driver.DBCEventLoggingListener
- spark.files.fetchFailure.unRegisterOutputOnHost=true
- spark.files.overwrite=true
- spark.files.useFetchCache=false
- spark.hadoop.databricks.dbfs.client.version=v2
- spark.hadoop.databricks.fs.perfMetrics.enable=true
- spark.hadoop.databricks.loki.fileStatusCache.abfs.enabled=true
- spark.hadoop.databricks.loki.fileStatusCache.gcs.enabled=true
- spark.hadoop.databricks.loki.fileStatusCache.s3a.enabled=true
- spark.hadoop.databricks.loki.fileSystemCache.enabled=true
- spark.hadoop.databricks.s3.create.deleteUnnecessaryFakeDirectories=false
- spark.hadoop.databricks.s3.verifyBucketExists.enabled=false
- spark.hadoop.databricks.s3commit.client.sslTrustAll=false
- spark.hadoop.fs.AbstractFileSystem.gs.impl=shaded.databricks.com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS
- spark.hadoop.fs.abfs.impl=com.databricks.common.filesystem.LokiFileSystem
- spark.hadoop.fs.abfs.impl.disable.cache=true
- spark.hadoop.fs.abfss.impl=com.databricks.common.filesystem.LokiFileSystem
- spark.hadoop.fs.abfss.impl.disable.cache=true
- spark.hadoop.fs.adl.impl=com.databricks.common.filesystem.LokiFileSystem
- spark.hadoop.fs.adl.impl.disable.cache=true
- spark.hadoop.fs.azure.authorization.caching.enable=false
- spark.hadoop.fs.azure.cache.invalidator.type=com.databricks.encryption.utils.CacheInvalidatorImpl
- spark.hadoop.fs.azure.readaheadqueue.depth=0
- spark.hadoop.fs.azure.skip.metrics=true
- spark.hadoop.fs.azure.user.agent.prefix=*********(redacted)
- spark.hadoop.fs.cpfs-abfss.impl=*********(redacted)
- spark.hadoop.fs.cpfs-abfss.impl.disable.cache=true
- spark.hadoop.fs.cpfs-adl.impl=*********(redacted)
- spark.hadoop.fs.cpfs-adl.impl.disable.cache=true
- spark.hadoop.fs.cpfs-s3.impl=*********(redacted)
- spark.hadoop.fs.cpfs-s3a.impl=*********(redacted)
- spark.hadoop.fs.cpfs-s3n.impl=*********(redacted)
- spark.hadoop.fs.dbfs.impl=com.databricks.backend.daemon.data.client.DbfsHadoop3
- spark.hadoop.fs.dbfsartifacts.impl=com.databricks.backend.daemon.data.client.DBFSV1
- spark.hadoop.fs.fcfs-abfs.impl=*********(redacted)
- spark.hadoop.fs.fcfs-abfs.impl.disable.cache=true
- spark.hadoop.fs.fcfs-abfss.impl=*********(redacted)
- spark.hadoop.fs.fcfs-abfss.impl.disable.cache=true
- spark.hadoop.fs.fcfs-s3.impl=*********(redacted)
- spark.hadoop.fs.fcfs-s3.impl.disable.cache=true
- spark.hadoop.fs.fcfs-s3a.impl=*********(redacted)
- spark.hadoop.fs.fcfs-s3a.impl.disable.cache=true
- spark.hadoop.fs.fcfs-s3n.impl=*********(redacted)
- spark.hadoop.fs.fcfs-s3n.impl.disable.cache=true
- spark.hadoop.fs.fcfs-wasb.impl=*********(redacted)
- spark.hadoop.fs.fcfs-wasb.impl.disable.cache=true
- spark.hadoop.fs.fcfs-wasbs.impl=*********(redacted)
- spark.hadoop.fs.fcfs-wasbs.impl.disable.cache=true
- spark.hadoop.fs.file.impl=com.databricks.backend.daemon.driver.WorkspaceLocalFileSystem
- spark.hadoop.fs.gs.impl=com.databricks.common.filesystem.LokiFileSystem
- spark.hadoop.fs.gs.impl.disable.cache=true
- spark.hadoop.fs.gs.outputstream.upload.chunk.size=16777216
- spark.hadoop.fs.idbfs.impl=com.databricks.io.idbfs.IdbfsFileSystem
- spark.hadoop.fs.mlflowdbfs.impl=com.databricks.mlflowdbfs.MlflowdbfsFileSystem
- spark.hadoop.fs.s3.impl=com.databricks.common.filesystem.LokiFileSystem
- spark.hadoop.fs.s3.impl.disable.cache=true
- spark.hadoop.fs.s3a.assumed.role.credentials.provider=*********(redacted)
- spark.hadoop.fs.s3a.attempts.maximum=10
- spark.hadoop.fs.s3a.block.size=67108864
- spark.hadoop.fs.s3a.connection.maximum=200
- spark.hadoop.fs.s3a.connection.timeout=50000
- spark.hadoop.fs.s3a.fast.upload=true
- spark.hadoop.fs.s3a.fast.upload.active.blocks=32
- spark.hadoop.fs.s3a.fast.upload.default=true
- spark.hadoop.fs.s3a.impl=com.databricks.common.filesystem.LokiFileSystem
- spark.hadoop.fs.s3a.impl.disable.cache=true
- spark.hadoop.fs.s3a.max.total.tasks=1000
- spark.hadoop.fs.s3a.multipart.size=10485760
- spark.hadoop.fs.s3a.multipart.threshold=104857600
- spark.hadoop.fs.s3a.retry.interval=250ms
- spark.hadoop.fs.s3a.retry.limit=6
- spark.hadoop.fs.s3a.retry.throttle.interval=500ms
- spark.hadoop.fs.s3a.threads.max=136
- spark.hadoop.fs.s3n.impl=com.databricks.common.filesystem.LokiFileSystem
- spark.hadoop.fs.s3n.impl.disable.cache=true
- spark.hadoop.fs.stage.impl=com.databricks.backend.daemon.driver.managedcatalog.PersonalStagingFileSystem
- spark.hadoop.fs.stage.impl.disable.cache=true
- spark.hadoop.fs.wasb.impl=com.databricks.common.filesystem.LokiFileSystem
- spark.hadoop.fs.wasb.impl.disable.cache=true
- spark.hadoop.fs.wasbs.impl=com.databricks.common.filesystem.LokiFileSystem
- spark.hadoop.fs.wasbs.impl.disable.cache=true
- spark.hadoop.hive.hmshandler.retry.attempts=10
- spark.hadoop.hive.hmshandler.retry.interval=2000
- spark.hadoop.hive.server2.enable.doAs=false
- spark.hadoop.hive.server2.idle.operation.timeout=7200000
- spark.hadoop.hive.server2.idle.session.timeout=900000
- spark.hadoop.hive.server2.keystore.password=*********(redacted)
- spark.hadoop.hive.server2.keystore.path=/databricks/keys/jetty-ssl-driver-keystore.jks
- spark.hadoop.hive.server2.session.check.interval=60000
- spark.hadoop.hive.server2.thrift.http.cookie.auth.enabled=false
- spark.hadoop.hive.server2.thrift.http.port=10000
- spark.hadoop.hive.server2.transport.mode=http
- spark.hadoop.hive.server2.use.SSL=true
- spark.hadoop.hive.warehouse.subdir.inherit.perms=false
- spark.hadoop.mapred.output.committer.class=com.databricks.backend.daemon.data.client.DirectOutputCommitter
- spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version=2
- spark.hadoop.parquet.abfs.readahead.optimization.enabled=true
- spark.hadoop.parquet.block.size.row.check.max=10
- spark.hadoop.parquet.block.size.row.check.min=10
- spark.hadoop.parquet.filter.columnindex.enabled=false
- spark.hadoop.parquet.memory.pool.ratio=0.5
- spark.hadoop.parquet.page.metadata.validation.enabled=true
- spark.hadoop.parquet.page.size.check.estimate=false
- spark.hadoop.parquet.page.verify-checksum.enabled=true
- spark.hadoop.parquet.page.write-checksum.enabled=true
- spark.hadoop.spark.databricks.io.parquet.verifyChecksumOnWrite.enabled=false
- spark.hadoop.spark.databricks.io.parquet.verifyChecksumOnWrite.throwsException=false
- spark.hadoop.spark.databricks.metrics.filesystem_metrics=true
- spark.hadoop.spark.driverproxy.customHeadersToProperties=*********(redacted)
- spark.hadoop.spark.hadoop.aws.glue.cache.db.size=1000
- spark.hadoop.spark.hadoop.aws.glue.cache.db.ttl-mins=30
- spark.hadoop.spark.hadoop.aws.glue.cache.table.size=1000
- spark.hadoop.spark.hadoop.aws.glue.cache.table.ttl-mins=30
- spark.hadoop.spark.sql.parquet.output.committer.class=org.apache.spark.sql.parquet.DirectParquetOutputCommitter
- spark.hadoop.spark.sql.sources.outputCommitterClass=com.databricks.backend.daemon.data.client.MapReduceDirectOutputCommitter
- spark.home=/databricks/spark
- spark.logConf=true
- spark.master=local[*, 4]
- spark.memory.offHeap.enabled=true
- spark.memory.offHeap.size=5728370688
- spark.metrics.conf=/databricks/spark/conf/metrics.properties
- spark.r.backendConnectionTimeout=604800
- spark.r.numRBackendThreads=1
- spark.rdd.compress=true
- spark.repl.class.outputDir=/local_disk0/tmp/repl/spark-7666683401531837394-987aa9b6-0e13-4dde-8745-8bb756ba9bd3
- spark.rpc.message.maxSize=256
- spark.scheduler.listenerbus.eventqueue.capacity=20000
- spark.scheduler.mode=FAIR
- spark.serializer.objectStreamReset=100
- spark.shuffle.manager=SORT
- spark.shuffle.memoryFraction=0.2
- spark.shuffle.reduceLocality.enabled=false
- spark.shuffle.service.enabled=true
- spark.shuffle.service.port=4048
- spark.sparklyr-backend.threads=1
- spark.sparkr.use.daemon=false
- spark.speculation=false
- spark.speculation.multiplier=3
- spark.speculation.quantile=0.9
- spark.sql.allowMultipleContexts=false
- spark.sql.hive.convertCTAS=true
- spark.sql.hive.convertMetastoreParquet=true
- spark.sql.hive.metastore.jars=/databricks/databricks-hive/*
- spark.sql.hive.metastore.sharedPrefixes=org.mariadb.jdbc,com.mysql.jdbc,org.postgresql,com.microsoft.sqlserver,microsoft.sql.DateTimeOffset,microsoft.sql.Types,com.databricks,com.codahale,com.fasterxml.jackson,shaded.databricks
- spark.sql.hive.metastore.version=0.13.0
- spark.sql.legacy.createHiveTableByDefault=false
- spark.sql.parquet.cacheMetadata=true
- spark.sql.parquet.compression.codec=snappy
- spark.sql.sources.commitProtocolClass=com.databricks.sql.transaction.directory.DirectoryAtomicCommitProtocol
- spark.sql.sources.default=delta
- spark.sql.streaming.checkpointFileManagerClass=com.databricks.spark.sql.streaming.DatabricksCheckpointFileManager
- spark.sql.streaming.stopTimeout=15s
- spark.sql.warehouse.dir=*********(redacted)
- spark.storage.blockManagerTimeoutIntervalMs=300000
- spark.storage.memoryFraction=0.5
- spark.streaming.driver.writeAheadLog.allowBatching=true
- spark.streaming.driver.writeAheadLog.closeFileAfterWrite=true
- spark.task.reaper.enabled=true
- spark.task.reaper.killTimeout=60s
- spark.ui.port=40001
- spark.ui.prometheus.enabled=true
- spark.worker.aioaLazyConfig.dbfsReadinessCheckClientClass=com.databricks.backend.daemon.driver.NephosDbfsReadinessCheckClient
- spark.worker.aioaLazyConfig.iamReadinessCheckClientClass=com.databricks.backend.daemon.driver.NephosIamRoleCheckClient
- spark.worker.cleanup.enabled=false
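
Everything from "Spark configuration:" down to this point is the effective configuration dump that spark.logConf=true (listed above) produces at startup. The same key/value pairs can be read back at runtime; a minimal spark-shell sketch:

    // Print the effective SparkConf, sorted by key, much as spark.logConf
    // does at startup (values redacted by Spark stay redacted here too).
    sc.getConf.getAll
      .sortBy(_._1)
      .foreach { case (k, v) => println(s"$k=$v") }
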
- 23/10/02 06:47:28 WARN MetricsSystem: Using default name SparkStatusTracker for source because neither spark.metrics.namespace nor spark.app.id is set.
- 23/10/02 06:47:28 INFO log: Logging initialized @17266ms to org.eclipse.jetty.util.log.Slf4jLog
- 23/10/02 06:47:28 INFO JettyUtils: Start Jetty 10.139.64.4:40001 for SparkUI
- 23/10/02 06:47:28 INFO Server: jetty-9.4.50.v20221201; built: 2022-12-01T22:07:03.915Z; git: da9a0b30691a45daf90a9f17b5defa2f1434f882; jvm 1.8.0_372-b07
- 23/10/02 06:47:28 INFO Server: Started @17488ms
- 23/10/02 06:47:29 INFO AbstractConnector: Started ServerConnector@29b61bb{HTTP/1.1, (http/1.1)}{10.139.64.4:40001}
- 23/10/02 06:47:29 INFO Utils: Successfully started service 'SparkUI' on port 40001.
- 23/10/02 06:47:29 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@64b242b3{/,null,AVAILABLE,@Spark}
- 23/10/02 06:47:30 INFO DriverPluginContainer: Initialized driver component for plugin org.apache.spark.sql.connect.SparkConnectPlugin.
- 23/10/02 06:47:30 INFO DLTDebugger: Registered DLTDebuggerEndpoint at endpoint dlt-debugger
- 23/10/02 06:47:31 INFO DriverPluginContainer: Initialized driver component for plugin org.apache.spark.debugger.DLTDebuggerSparkPlugin.
- 23/10/02 06:47:31 INFO FairSchedulableBuilder: Fair scheduler configuration not found, created default pool: default, schedulingMode: FAIR, minShare: 0, weight: 1
- 23/10/02 06:47:31 INFO Executor: Starting executor ID driver on host 10.139.64.4
- 23/10/02 06:47:31 ERROR FuseLogAggregator: 2023/10/02 06:46:56.990553 dbr.ERROR Unable to setup backend: Volumes: &{/Volumes uc-volumes:/Volumes {CseMetaFetching: S3FsImpl: AccessKeyId: SecretAccessKey: SessionTokenId: CannedACL:{cannedAclHeader:} RequesterPays: CredentialsType: CredentialsProvider: StsAssumeRoleArn: StsAssumeRoleExternalId: StsEndpoint: AutoDetectEndpoint: Endpoint: FallbackEndpoint: SseAlgorithm: SseKey: SseCAlgorithm: Cse: CseMaterialProvider: OldAccessKeyId: OldSecretAccessKey: OldStsAssumeRoleArn: OldStsAssumeRoleExternalId: OldStsEndpoint: OldSseKey: OldSseKmsKeyId: OldSseCKey: OldCredentialsCustomClass: AdlsOauth2ClientId: AdlsOauth2RefreshUrl: AdlsOauth2Credential: AdlsOauth2TokenProvider: AdlsOauth2TokenProviderType: AdlsOauth2TokenCustomProvider: AdlsOauth2RefreshToken: AdlsFsImpl: AbfsAuthType: AbfsOauthProvider: AbfsOauth2ClientId: AbfsOauth2ClientSecret: AbfsOauth2ClientEndpoint: AbfsTokenProviderType: AbfsSasToken: GcpCredentialsType: Attributes:map[]}}: Unrecognized storage scheme: uc-volumes
- 23/10/02 06:47:31 ERROR FuseLogAggregator: 2023/10/02 06:46:56.990771 dbr.ERROR Unable to setup backend: Volume: &{/Volume dbfs-reserved-path:/uc-volumes-reserved {CseMetaFetching: S3FsImpl: AccessKeyId: SecretAccessKey: SessionTokenId: CannedACL:{cannedAclHeader:} RequesterPays: CredentialsType: CredentialsProvider: StsAssumeRoleArn: StsAssumeRoleExternalId: StsEndpoint: AutoDetectEndpoint: Endpoint: FallbackEndpoint: SseAlgorithm: SseKey: SseCAlgorithm: Cse: CseMaterialProvider: OldAccessKeyId: OldSecretAccessKey: OldStsAssumeRoleArn: OldStsAssumeRoleExternalId: OldStsEndpoint: OldSseKey: OldSseKmsKeyId: OldSseCKey: OldCredentialsCustomClass: AdlsOauth2ClientId: AdlsOauth2RefreshUrl: AdlsOauth2Credential: AdlsOauth2TokenProvider: AdlsOauth2TokenProviderType: AdlsOauth2TokenCustomProvider: AdlsOauth2RefreshToken: AdlsFsImpl: AbfsAuthType: AbfsOauthProvider: AbfsOauth2ClientId: AbfsOauth2ClientSecret: AbfsOauth2ClientEndpoint: AbfsTokenProviderType: AbfsSasToken: GcpCredentialsType: Attributes:map[]}}: Unrecognized storage scheme: dbfs-reserved-path
- 23/10/02 06:47:31 ERROR FuseLogAggregator: 2023/10/02 06:46:56.990839 dbr.ERROR Unable to setup backend: volumes: &{/volumes dbfs-reserved-path:/uc-volumes-reserved {CseMetaFetching: S3FsImpl: AccessKeyId: SecretAccessKey: SessionTokenId: CannedACL:{cannedAclHeader:} RequesterPays: CredentialsType: CredentialsProvider: StsAssumeRoleArn: StsAssumeRoleExternalId: StsEndpoint: AutoDetectEndpoint: Endpoint: FallbackEndpoint: SseAlgorithm: SseKey: SseCAlgorithm: Cse: CseMaterialProvider: OldAccessKeyId: OldSecretAccessKey: OldStsAssumeRoleArn: OldStsAssumeRoleExternalId: OldStsEndpoint: OldSseKey: OldSseKmsKeyId: OldSseCKey: OldCredentialsCustomClass: AdlsOauth2ClientId: AdlsOauth2RefreshUrl: AdlsOauth2Credential: AdlsOauth2TokenProvider: AdlsOauth2TokenProviderType: AdlsOauth2TokenCustomProvider: AdlsOauth2RefreshToken: AdlsFsImpl: AbfsAuthType: AbfsOauthProvider: AbfsOauth2ClientId: AbfsOauth2ClientSecret: AbfsOauth2ClientEndpoint: AbfsTokenProviderType: AbfsSasToken: GcpCredentialsType: Attributes:map[]}}: Unrecognized storage scheme: dbfs-reserved-path
- 23/10/02 06:47:31 ERROR FuseLogAggregator: 2023/10/02 06:46:56.990899 dbr.ERROR Unable to setup backend: volume: &{/volume dbfs-reserved-path:/uc-volumes-reserved {CseMetaFetching: S3FsImpl: AccessKeyId: SecretAccessKey: SessionTokenId: CannedACL:{cannedAclHeader:} RequesterPays: CredentialsType: CredentialsProvider: StsAssumeRoleArn: StsAssumeRoleExternalId: StsEndpoint: AutoDetectEndpoint: Endpoint: FallbackEndpoint: SseAlgorithm: SseKey: SseCAlgorithm: Cse: CseMaterialProvider: OldAccessKeyId: OldSecretAccessKey: OldStsAssumeRoleArn: OldStsAssumeRoleExternalId: OldStsEndpoint: OldSseKey: OldSseKmsKeyId: OldSseCKey: OldCredentialsCustomClass: AdlsOauth2ClientId: AdlsOauth2RefreshUrl: AdlsOauth2Credential: AdlsOauth2TokenProvider: AdlsOauth2TokenProviderType: AdlsOauth2TokenCustomProvider: AdlsOauth2RefreshToken: AdlsFsImpl: AbfsAuthType: AbfsOauthProvider: AbfsOauth2ClientId: AbfsOauth2ClientSecret: AbfsOauth2ClientEndpoint: AbfsTokenProviderType: AbfsSasToken: GcpCredentialsType: Attributes:map[]}}: Unrecognized storage scheme: dbfs-reserved-path
- 23/10/02 06:47:31 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): 'file:/databricks/spark/dbconf/log4j/executor/,file:/databricks/spark/dbconf/jets3t/,file:/databricks/spark/dbconf/hadoop/,file:/databricks/hive/conf/,file:/databricks/jars/*,file:/databricks/driver/conf/,file:/databricks/driver/hadoop,file:/databricks/driver/executor,file:/databricks/driver/*,file:/databricks/driver/jets3t'
- 23/10/02 06:47:31 INFO Executor: Using REPL class URI: spark://10.139.64.4:44035/classes
- 23/10/02 06:47:31 INFO ExecutorPluginContainer: Initialized executor component for plugin org.apache.spark.debugger.DLTDebuggerSparkPlugin.
- 23/10/02 06:47:31 INFO Utils: resolved command to be run: WrappedArray(getconf, PAGESIZE)
- 23/10/02 06:47:31 INFO TaskSchedulerImpl: Task preemption enabled.
- 23/10/02 06:47:31 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41225.
- 23/10/02 06:47:31 INFO NettyBlockTransferService: Server created on 10.139.64.4:41225
- 23/10/02 06:47:31 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
- 23/10/02 06:47:31 INFO BlockManager: external shuffle service port = 4048
- 23/10/02 06:47:31 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.139.64.4, 41225, None)
- 23/10/02 06:47:31 INFO BlockManagerMasterEndpoint: Registering block manager 10.139.64.4:41225 with 8.7 GiB RAM, BlockManagerId(driver, 10.139.64.4, 41225, None)
- 23/10/02 06:47:31 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.139.64.4, 41225, None)
- 23/10/02 06:47:31 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.139.64.4, 41225, None)
- 23/10/02 06:47:31 INFO DBCEventLoggingListener: Initializing DBCEventLoggingListener
- 23/10/02 06:47:31 INFO DBCEventLoggingListener: Logging events to eventlogs/7666683401531837394/eventlog
- 23/10/02 06:47:31 INFO Server: jetty-9.4.50.v20221201; built: 2022-12-01T22:07:03.915Z; git: da9a0b30691a45daf90a9f17b5defa2f1434f882; jvm 1.8.0_372-b07
- 23/10/02 06:47:31 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@24b2b5d3{/,null,AVAILABLE}
- 23/10/02 06:47:31 INFO SslContextFactory: x509=X509@7a8b7f2e(1,h=[az-australiaeast.workers.prod.ns.databricks.com],a=[],w=[]) for Server@74d7ebce[provider=null,keyStore=file:///databricks/keys/jetty_ssl_driver_keystore.jks,trustStore=file:///databricks/keys/jetty_ssl_driver_keystore.jks]
- 23/10/02 06:47:31 INFO AbstractConnector: Started ServerConnector@34e1a87b{SSL, (ssl, http/1.1)}{0.0.0.0:1023}
- 23/10/02 06:47:31 INFO Server: Started @20341ms
- 23/10/02 06:47:31 INFO FuseDaemonServer: FuseDaemonServer started on 1023 with endpoint: '/get-unity-token'.
- 23/10/02 06:47:31 INFO SparkContext: Registered listener com.databricks.backend.daemon.driver.DBCEventLoggingListener
- 23/10/02 06:47:32 INFO DatabricksILoop$: Finished creating throwaway interpreter
- 23/10/02 06:47:32 INFO ContextHandler: Stopped o.e.j.s.ServletContextHandler@64b242b3{/,null,STOPPED,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@29617475{/jobs,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@353b5d86{/jobs/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@43eea3bd{/jobs/job,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7f7eeaaf{/jobs/job/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@38499139{/stages,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2ca93dee{/stages/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2265a052{/stages/stage,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@72090715{/stages/stage/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@fcdeb50{/stages/pool,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@65f1bf2c{/stages/pool/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@286f8e90{/storage,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4ea8832c{/storage/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@11a9f958{/storage/rdd,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7fbe0f64{/storage/rdd/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@53b42a0d{/environment,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@33c5d3e{/environment/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@376b6d7d{/executors,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6f272dc4{/executors/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@28817763{/executors/threadDump,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6abdead7{/executors/threadDump/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6232cde5{/executors/heapHistogram,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4e9b66ab{/executors/heapHistogram/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@27828021{/static,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4daa53e0{/,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@42748f59{/api,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@86e7c7a{/metrics,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@22edca96{/jobs/job/kill,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6e1bd2b{/stages/stage/kill,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@13f6395d{/metrics/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:32 INFO SparkContext: Loading Spark Service RPC Server. Classloader stack:List(com.databricks.backend.daemon.driver.ClassLoaders$MultiReplClassLoader@33d08a24, com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader@7906de2f, sun.misc.Launcher$AppClassLoader@3d299e3, sun.misc.Launcher$ExtClassLoader@b672aa8)
- 23/10/02 06:47:33 INFO SparkServiceRPCServer: Initializing Spark Service RPC Server. Classloader stack: List(com.databricks.backend.daemon.driver.ClassLoaders$MultiReplClassLoader@33d08a24, com.databricks.backend.daemon.driver.ClassLoaders$LibraryClassLoader@7906de2f, sun.misc.Launcher$AppClassLoader@3d299e3, sun.misc.Launcher$ExtClassLoader@b672aa8)
- 23/10/02 06:47:33 INFO SparkServiceRPCServer: Spark Service RPC Server is disabled.
- 23/10/02 06:47:33 INFO DatabricksILoop$: Successfully registered spark metrics in Prometheus registry
- 23/10/02 06:47:33 INFO DatabricksILoop$: Successfully initialized SparkContext
- 23/10/02 06:47:33 INFO SharedState: Scheduler stats enabled.
- 23/10/02 06:47:33 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir.
- 23/10/02 06:47:33 INFO SharedState: Warehouse path is 'dbfs:/user/hive/warehouse'.
- 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@295d1a45{/storage/iocache,null,AVAILABLE,@Spark}
- 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4b45016d{/storage/iocache/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@134ccb03{/SQL,null,AVAILABLE,@Spark}
- 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@12e25b4b{/SQL/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@139e291e{/SQL/execution,null,AVAILABLE,@Spark}
- 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2d58136{/SQL/execution/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:33 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@6b8de9dc{/static/sql,null,AVAILABLE,@Spark}
- 23/10/02 06:47:33 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
- 23/10/02 06:47:33 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
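
The SQLConf warning above names its own remedy: stop relying on the deprecated spark.sql.hive.convertCTAS and set the replacement key directly. In spark-shell:

    // Preferred, non-deprecated way to get the same CTAS behaviour:
    spark.conf.set("spark.sql.legacy.createHiveTableByDefault", "false")
    // instead of the deprecated key the warning complains about:
    // spark.conf.set("spark.sql.hive.convertCTAS", "true")
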
- 23/10/02 06:47:37 INFO DriverConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:37 INFO DriverConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:37 WARN DriverConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
- 23/10/02 06:47:39 INFO DatabricksMountsStore: Mount store initialization: Attempting to get the list of mounts from metadata manager of DBFS
- 23/10/02 06:47:39 INFO log: Logging initialized @28259ms to shaded.v9_4.org.eclipse.jetty.util.log.Slf4jLog
- 23/10/02 06:47:39 INFO DynamicRpcConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:39 INFO DynamicRpcConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:39 WARN DynamicRpcConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
- 23/10/02 06:47:40 INFO TypeUtil: JVM Runtime does not support Modules
- 23/10/02 06:47:40 INFO DatabricksMountsStore: Mount store initialization: Received a list of 9 mounts accessible from metadata manager of DBFS
- 23/10/02 06:47:40 INFO DatabricksMountsStore: Updated mounts cache. Changes: List((+,DbfsMountPoint(s3a://databricks-datasets-sydney/, /databricks-datasets)), (+,DbfsMountPoint(uc-volumes:/Volumes, /Volumes)), (+,DbfsMountPoint(unsupported-access-mechanism-for-path--use-mlflow-client:/, /databricks/mlflow-tracking)), (+,DbfsMountPoint(abfss://dbstorageelrh4j5j7bx2k.dfs.core.windows.net/4805034236521897, /databricks-results)), (+,DbfsMountPoint(unsupported-access-mechanism-for-path--use-mlflow-client:/, /databricks/mlflow-registry)), (+,DbfsMountPoint(dbfs-reserved-path:/uc-volumes-reserved, /Volume)), (+,DbfsMountPoint(dbfs-reserved-path:/uc-volumes-reserved, /volumes)), (+,DbfsMountPoint(abfss://dbstorageelrh4j5j7bx2k.dfs.core.windows.net/4805034236521897, /)), (+,DbfsMountPoint(dbfs-reserved-path:/uc-volumes-reserved, /volume)))
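
The nine mounts the store just cached can be listed from a notebook with the standard dbutils API (available on Databricks clusters, not in plain spark-shell):

    // Each MountInfo pairs a mount point with its backing source, matching
    // the DbfsMountPoint entries in the log line above.
    dbutils.fs.mounts().foreach { m =>
      println(s"${m.mountPoint} -> ${m.source}")
    }
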
- 23/10/02 06:47:41 INFO DatabricksFileSystemV2Factory: Creating abfss file system for abfss://[email protected]
- 23/10/02 06:47:42 INFO AzureBlobFileSystem:V3: Initializing AzureBlobFileSystem for abfss://[email protected]/4805034236521897 with credential = FixedSASTokenProvider with jvmId = 491
- 23/10/02 06:47:42 INFO DbfsHadoop3: Initialized DBFS with DBFSV2 as the delegate.
- 23/10/02 06:47:42 INFO HiveConf: Found configuration file file:/databricks/hive/conf/hive-site.xml
- 23/10/02 06:47:42 INFO SessionManager: HiveServer2: Background operation thread pool size: 100
- 23/10/02 06:47:42 INFO SessionManager: HiveServer2: Background operation thread wait queue size: 100
- 23/10/02 06:47:42 INFO SessionManager: HiveServer2: Background operation thread keepalive time: 10 seconds
- 23/10/02 06:47:42 INFO AbstractService: Service:OperationManager is inited.
- 23/10/02 06:47:42 INFO AbstractService: Service:SessionManager is inited.
- 23/10/02 06:47:42 INFO SparkSQLCLIService: Service: CLIService is inited.
- 23/10/02 06:47:42 INFO AbstractService: Service:ThriftHttpCLIService is inited.
- 23/10/02 06:47:42 INFO HiveThriftServer2: Service: HiveServer2 is inited.
- 23/10/02 06:47:42 INFO AbstractService: Service:OperationManager is started.
- 23/10/02 06:47:42 INFO AbstractService: Service:SessionManager is started.
- 23/10/02 06:47:42 INFO SparkSQLCLIService: Service: CLIService is started.
- 23/10/02 06:47:42 INFO AbstractService: Service:ThriftHttpCLIService is started.
- 23/10/02 06:47:42 INFO ThriftCLIService: HTTP Server SSL: adding excluded protocols: [SSLv2, SSLv3]
- 23/10/02 06:47:42 INFO ThriftCLIService: HTTP Server SSL: SslContextFactory.getExcludeProtocols = [SSL, SSLv2, SSLv2Hello, SSLv3]
- 23/10/02 06:47:42 INFO Server: jetty-9.4.50.v20221201; built: 2022-12-01T22:07:03.915Z; git: da9a0b30691a45daf90a9f17b5defa2f1434f882; jvm 1.8.0_372-b07
- 23/10/02 06:47:42 INFO session: DefaultSessionIdManager workerName=node0
- 23/10/02 06:47:42 INFO session: No SessionScavenger set, using defaults
- 23/10/02 06:47:43 INFO session: node0 Scavenging every 600000ms
- 23/10/02 06:47:43 WARN SecurityHandler: [email protected]@2e5f2387{/,null,STARTING} has uncovered http methods for path: /*
- 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2e5f2387{/,null,AVAILABLE}
- 23/10/02 06:47:43 INFO SslContextFactory: x509=X509@747d4d9a(1,h=[az-australiaeast.workers.prod.ns.databricks.com],a=[],w=[]) for Server@2982f1eb[provider=null,keyStore=file:///databricks/keys/jetty-ssl-driver-keystore.jks,trustStore=null]
- 23/10/02 06:47:43 INFO AbstractConnector: Started ServerConnector@50a4e4e{SSL, (ssl, http/1.1)}{0.0.0.0:10000}
- 23/10/02 06:47:43 INFO Server: Started @31573ms
- 23/10/02 06:47:43 INFO ThriftCLIService: Started ThriftHttpCLIService in https mode on port 10000 path=/cliservice/* with 5...500 worker threads
- 23/10/02 06:47:43 INFO AbstractService: Service:HiveServer2 is started.
- 23/10/02 06:47:43 INFO HiveThriftServer2: HiveThriftServer2 started
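
With ThriftHttpCLIService listening (https on port 10000, path /cliservice, per the lines above), a generic Hive JDBC client could reach it. A sketch using the standard Hive JDBC driver, assuming it is on the classpath; the host, user, and token are placeholders, and Databricks' documented connection strings differ from this bare form:

    import java.sql.DriverManager

    // Generic Hive JDBC over HTTP(S), matching transport.mode=http,
    // port 10000 and the /cliservice path from the startup log.
    val url = "jdbc:hive2://<driver-host>:10000/default;" +
      "transportMode=http;httpPath=cliservice;ssl=true"
    val conn = DriverManager.getConnection(url, "token", sys.env("DATABRICKS_TOKEN"))
    try {
      val rs = conn.createStatement().executeQuery("SELECT 1")
      while (rs.next()) println(rs.getInt(1))
    } finally conn.close()
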
- 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@2324100b{/sqlserver,null,AVAILABLE,@Spark}
- 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@32c77e0d{/sqlserver/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@399b9537{/sqlserver/session,null,AVAILABLE,@Spark}
- 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7d4f7aab{/sqlserver/session/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:43 INFO LibraryResolutionManager: Preferred maven central mirror is configured to https://maven-central.storage-download.googleapis.com/maven2/
- 23/10/02 06:47:43 WARN OutgoingDirectNotebookBufferRateLimiter$: No value specified for db-outgoing-buffer-throttler-burst. Using default: 100000000000
- 23/10/02 06:47:43 WARN OutgoingDirectNotebookBufferRateLimiter$: No value specified for db-outgoing-buffer-throttler-steady-rate. Using default: 6000000000
- 23/10/02 06:47:43 WARN OutgoingDirectNotebookBufferRateLimiter$: No value specified for db-outgoing-buffer-throttler-warning-interval-sec. Using default: 60
- 23/10/02 06:47:43 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
- 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7c9745ae{/StreamingQuery,null,AVAILABLE,@Spark}
- 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@5b481d77{/StreamingQuery/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@42acca1a{/StreamingQuery/statistics,null,AVAILABLE,@Spark}
- 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@4a26a54b{/StreamingQuery/statistics/json,null,AVAILABLE,@Spark}
- 23/10/02 06:47:43 INFO ContextHandler: Started o.e.j.s.ServletContextHandler@7d151d89{/static/sql,null,AVAILABLE,@Spark}
- 23/10/02 06:47:43 INFO JettyServer$: Creating thread pool with name ...
- 23/10/02 06:47:43 INFO JettyServer$: Thread pool created
- 23/10/02 06:47:43 INFO JettyServer$: Creating thread pool with name ...
- 23/10/02 06:47:43 INFO JettyServer$: Thread pool created
- 23/10/02 06:47:43 INFO DriverDaemon: Starting driver daemon...
- 23/10/02 06:47:43 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.password, old value Some(), new value [REDACTED], new value will take effect.
- 23/10/02 06:47:43 WARN SparkConfUtils$: Setting the same key twice for spark.databricks.io.directoryCommit.enableLogicalDelete, old value Some(false), new value false, new value will take effect.
- 23/10/02 06:47:43 WARN SparkConfUtils$: Setting the same key twice for spark.hadoop.hive.server2.keystore.path, old value Some(/databricks/keys/jetty_ssl_driver_keystore.jks), new value /databricks/keys/jetty-ssl-driver-keystore.jks, new value will take effect.
- 23/10/02 06:47:43 INFO SparkConfUtils$: Customize spark config according to file /tmp/custom-spark.conf
- 23/10/02 06:47:43 INFO DriverDaemon$: Attempting to run: 'set up ttyd daemon'
- 23/10/02 06:47:43 INFO DriverDaemon$: Attempting to run: 'Configuring RStudio daemon'
- 23/10/02 06:47:43 INFO DriverDaemon$: Resetting the default python executable
- 23/10/02 06:47:43 INFO Utils: resolved command to be run: List(virtualenv, /local_disk0/.ephemeral_nfs/cluster_libraries/python, -p, /databricks/python/bin/python, --no-download, --no-setuptools, --no-wheel)
- 23/10/02 06:47:44 INFO PythonEnvCloneHelper$: Created python virtualenv: /local_disk0/.ephemeral_nfs/cluster_libraries/python
- 23/10/02 06:47:44 INFO Utils: resolved command to be run: List(/databricks/python/bin/python, -c, import sys; dirs=[p for p in sys.path if 'package' in p]; print(' '.join(dirs)))
- 23/10/02 06:47:45 INFO Utils: resolved command to be run: List(/local_disk0/.ephemeral_nfs/cluster_libraries/python/bin/python, -c, from sysconfig import get_path; print(get_path('purelib')))
- 23/10/02 06:47:45 INFO PythonEnvCloneHelper$: Created sites.pth at /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.10/site-packages/sites.pth
- 23/10/02 06:47:45 INFO ClusterWidePythonEnvManager: Registered /local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.10/site-packages with the WatchService sun.nio.fs.LinuxWatchService$LinuxWatchKey@77d68b94
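The lines above are a build recipe: create a bare virtualenv off the base interpreter, ask the base interpreter for its package directories, and write them into a sites.pth inside the new environment so base packages stay importable. A hedged Scala sketch of the same sequence; the commands and paths are copied from the log, but the wrapper object is hypothetical, not Databricks' actual PythonEnvCloneHelper:

    import java.nio.file.{Files, Paths}
    import scala.sys.process._

    object EnvCloneSketch {
      def main(args: Array[String]): Unit = {
        val target = "/local_disk0/.ephemeral_nfs/cluster_libraries/python"
        // Same flags as the logged command: reuse the base interpreter, skip downloads.
        Seq("virtualenv", target, "-p", "/databricks/python/bin/python",
            "--no-download", "--no-setuptools", "--no-wheel").!
        // Ask the base interpreter for its package directories...
        val baseDirs = Seq("/databricks/python/bin/python", "-c",
            "import sys; dirs=[p for p in sys.path if 'package' in p]; print(' '.join(dirs))").!!.trim
        // ...and write them, one per line, into a .pth file in the clone's
        // site-packages so the clone can still import base packages.
        val sitePackages = Seq(s"$target/bin/python", "-c",
            "from sysconfig import get_path; print(get_path('purelib'))").!!.trim
        Files.write(Paths.get(sitePackages, "sites.pth"),
                    baseDirs.replace(' ', '\n').getBytes("UTF-8"))
      }
    }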
- 23/10/02 06:47:45 INFO DriverDaemon$: Attempting to run: 'Update root virtualenv'
- 23/10/02 06:47:45 INFO DriverDaemon$: Finished updating /etc/environment
- 23/10/02 06:47:45 INFO DriverDaemon$$anon$1: Message out thread ready
- 23/10/02 06:47:45 INFO NetstatUtil$: Running netstat -lnpt
- 23/10/02 06:47:45 INFO NetstatUtil$: netstat -lnpt
- Active Internet connections (only servers)
- Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name
- tcp 0 0 0.0.0.0:7681 0.0.0.0:* LISTEN 678/ttyd
- tcp 0 0 127.0.0.53:53 0.0.0.0:* LISTEN 55/systemd-resolved
- tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN 71/sshd: /usr/sbin/
- tcp6 0 0 10.139.64.4:44035 :::* LISTEN 491/java
- tcp6 0 0 :::6060 :::* LISTEN 339/java
- tcp6 0 0 :::15002 :::* LISTEN 491/java
- tcp6 0 0 :::7071 :::* LISTEN 339/java
- tcp6 0 0 10.139.64.4:41225 :::* LISTEN 491/java
- tcp6 0 0 10.139.64.4:40001 :::* LISTEN 491/java
- tcp6 0 0 :::1017 :::* LISTEN 155/wsfs
- tcp6 0 0 :::1021 :::* LISTEN 155/wsfs
- tcp6 0 0 :::1023 :::* LISTEN 491/java
- tcp6 0 0 :::1015 :::* LISTEN 176/goofys-dbr
- tcp6 0 0 :::22 :::* LISTEN 71/sshd: /usr/sbin/
- tcp6 0 0 :::10000 :::* LISTEN 491/java
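The snapshot above is produced by simply shelling out to netstat; nothing driver-specific is involved. A minimal sketch of the same capture, keeping only listening sockets (assumes the net-tools netstat is installed):

    import scala.sys.process._

    object ListenerSnapshot {
      def main(args: Array[String]): Unit = {
        // -l listening, -n numeric, -p PID/program name, -t TCP
        val out = Seq("netstat", "-lnpt").!!
        out.linesIterator
           .filter(_.contains("LISTEN"))
           .foreach(println)
      }
    }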
- 23/10/02 06:47:45 INFO Server: jetty-9.4.50.v20221201; built: 2022-12-01T22:07:03.915Z; git: da9a0b30691a45daf90a9f17b5defa2f1434f882; jvm 1.8.0_372-b07
- 23/10/02 06:47:45 INFO AbstractConnector: Started ServerConnector@48c2391{HTTP/1.1, (http/1.1)}{0.0.0.0:6061}
- 23/10/02 06:47:45 INFO Server: Started @34088ms
- 23/10/02 06:47:45 INFO Server: jetty-9.4.50.v20221201; built: 2022-12-01T22:07:03.915Z; git: da9a0b30691a45daf90a9f17b5defa2f1434f882; jvm 1.8.0_372-b07
- 23/10/02 06:47:45 INFO SslContextFactory: x509=X509@688d3e2a(1,h=[az-australiaeast.workers.prod.ns.databricks.com],a=[],w=[]) for Server@4fc2e703[provider=null,keyStore=null,trustStore=null]
- 23/10/02 06:47:45 WARN config: Weak cipher suite TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA enabled for Server@4fc2e703[provider=null,keyStore=null,trustStore=null]
- 23/10/02 06:47:45 WARN config: Weak cipher suite TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA enabled for Server@4fc2e703[provider=null,keyStore=null,trustStore=null]
- 23/10/02 06:47:45 INFO AbstractConnector: Started ServerConnector@2cb34e1e{SSL, (ssl, http/1.1)}{0.0.0.0:6062}
- 23/10/02 06:47:45 INFO Server: Started @34181ms
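The two weak-cipher warnings come from Jetty auditing the enabled suite list at startup. If no clients require the CBC suites, they can be excluded by pattern on the SslContextFactory; a sketch against the Jetty 9.4 API, where keystore path and password handling are placeholders rather than the daemon's real configuration:

    import org.eclipse.jetty.util.ssl.SslContextFactory

    object TlsConfigSketch {
      def main(args: Array[String]): Unit = {
        val ssl = new SslContextFactory.Server()
        ssl.setKeyStorePath("/databricks/keys/jetty-ssl-driver-keystore.jks") // placeholder
        ssl.setKeyStorePassword(sys.env.getOrElse("KEYSTORE_PASSWORD", ""))   // placeholder
        // Exclude entries are treated as regexes in Jetty 9.4; this drops all
        // CBC suites, which silences the "Weak cipher suite ... enabled" warnings.
        ssl.addExcludeCipherSuites("^TLS_.*_CBC_.*$")
      }
    }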
- 23/10/02 06:47:45 INFO DriverDaemon: Started comm channel server
- 23/10/02 06:47:45 INFO DriverDaemon: Driver daemon started.
- 23/10/02 06:47:45 INFO DynamicInfoServiceConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:45 INFO DynamicInfoServiceConf: Configured feature flag data source LaunchDarkly
- 23/10/02 06:47:45 WARN DynamicInfoServiceConf: REGION environment variable is not defined. getConfForCurrentRegion will always return default value
- 23/10/02 06:47:47 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
- 23/10/02 06:47:47 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
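This deprecation warning recurs with every REPL the corral starts below; the migration the message asks for is a single config change. A sketch with a local SparkSession (on a real cluster the usual fix is setting it in the cluster's Spark config):

    import org.apache.spark.sql.SparkSession

    object CtasConfigSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .master("local[*]")
          // Spark 3.1+ replacement for the deprecated spark.sql.hive.convertCTAS:
          // with false, CTAS creates a data-source table rather than a Hive SerDe table.
          .config("spark.sql.legacy.createHiveTableByDefault", "false")
          .getOrCreate()
        println(spark.conf.get("spark.sql.legacy.createHiveTableByDefault"))
        spark.stop()
      }
    }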
- 23/10/02 06:47:47 INFO DriverCorral: Loading the root classloader
- 23/10/02 06:47:47 INFO DriverCorral: Starting sql repl ReplId-a20c4-6dc3c-e17de
- 23/10/02 06:47:47 INFO DriverCorral: Starting sql repl ReplId-61ca1-d7420-ce2cf-b
- 23/10/02 06:47:47 INFO DriverCorral: Starting sql repl ReplId-38b1a-f400f-5b2ff-3
- 23/10/02 06:47:47 INFO DriverCorral: Starting sql repl ReplId-7ec5d-8a797-40808-0
- 23/10/02 06:47:47 INFO DriverCorral: Starting sql repl ReplId-1f405-90a42-fa0c2-4
- 23/10/02 06:47:47 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
- 23/10/02 06:47:47 WARN SQLConf: The SQL config 'spark.sql.hive.convertCTAS' has been deprecated in Spark v3.1 and may be removed in the future. Set 'spark.sql.legacy.createHiveTableByDefault' to false instead.
- 23/10/02 06:47:47 INFO SQLDriverWrapper: setupRepl:ReplId-38b1a-f400f-5b2ff-3: finished loading
- 23/10/02 06:47:47 INFO SQLDriverWrapper: setupRepl:ReplId-7ec5d-8a797-40808-0: finished loading
- 23/10/02 06:47:47 INFO SQLDriverWrapper: setupRepl:ReplId-1f405-90a42-fa0c2-4: finished loading
- 23/10/02 06:47:47 INFO SQLDriverWrapper: setupRepl:ReplId-61ca1-d7420-ce2cf-b: finished loading
- 23/10/02 06:47:47 INFO SQLDriverWrapper: setupRepl:ReplId-a20c4-6dc3c-e17de: finished loading
- 23/10/02 06:47:47 INFO DriverCorral: Starting r repl ReplId-62225-c3991-b74c2-9
- 23/10/02 06:47:47 INFO ROutputStreamHandler: Connection succeeded on port 34891
- 23/10/02 06:47:47 INFO ROutputStreamHandler: Connection succeeded on port 33679
- 23/10/02 06:47:47 INFO RDriverLocal: 1. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: object created for ReplId-62225-c3991-b74c2-9.
- 23/10/02 06:47:47 INFO RDriverLocal: 2. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: initializing ...
- 23/10/02 06:47:47 INFO RDriverLocal: 3. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: started RBackend thread on port 36019
- 23/10/02 06:47:47 INFO RDriverLocal: 4. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: waiting for SparkR to be installed ...
- 23/10/02 06:47:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:47:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:47:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:47:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
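These df/meminfo pairs repeat roughly every twenty seconds for the remainder of the log; they are a routine disk- and memory-pressure poll, not a fault. A sketch of the same probe, extracting free space on /local_disk0 and MemAvailable from /proc/meminfo:

    import scala.io.Source
    import scala.sys.process._

    object NodeHealthProbe {
      def main(args: Array[String]): Unit = {
        // Same command the driver resolves in the log above.
        val dfOut = Seq("/bin/sh", "-c", "df /local_disk0").!!
        // Last df line: Filesystem 1K-blocks Used Available Use% Mounted-on
        val availableKb = dfOut.trim.linesIterator.toSeq.last.split("\\s+")(3).toLong
        val memAvailableKb = Source.fromFile("/proc/meminfo")
          .getLines()
          .collectFirst { case l if l.startsWith("MemAvailable:") =>
            l.split("\\s+")(1).toLong }
          .getOrElse(0L)
        println(s"/local_disk0 free: ${availableKb} kB, MemAvailable: ${memAvailableKb} kB")
      }
    }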
- 23/10/02 06:48:00 INFO RDriverLocal$: SparkR installation completed.
- 23/10/02 06:48:00 INFO RDriverLocal: 5. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: launching R process ...
- 23/10/02 06:48:00 INFO RDriverLocal: 6. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: cgroup isolation disabled, not placing R process in REPL cgroup.
- 23/10/02 06:48:00 INFO RDriverLocal: 7. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: starting R process on port 1100 (attempt 1) ...
- 23/10/02 06:48:00 INFO RDriverLocal$: Debugging command for R process builder: SIMBASPARKINI=/etc/simba.sparkodbc.ini R_LIBS=/local_disk0/.ephemeral_nfs/envs/rEnv-f23941ad-7dc9-49bd-86b0-f366cff935f3:/databricks/spark/R/lib:/local_disk0/.ephemeral_nfs/cluster_libraries/r LD_LIBRARY_PATH=/opt/simba/sparkodbc/lib/64/ SPARKR_BACKEND_CONNECTION_TIMEOUT=604800 DB_STREAM_BEACON_STRING_START=DATABRICKS_STREAM_START-ReplId-62225-c3991-b74c2-9 DB_STDOUT_STREAM_PORT=34891 SPARKR_BACKEND_AUTH_SECRET=[REDACTED] DB_STREAM_BEACON_STRING_END=DATABRICKS_STREAM_END-ReplId-62225-c3991-b74c2-9 EXISTING_SPARKR_BACKEND_PORT=36019 ODBCINI=/etc/odbc.ini DB_STDERR_STREAM_PORT=33679 /bin/bash /local_disk0/tmp/_startR.sh1893372858473036520resource.r /local_disk0/tmp/_rServeScript.r7342418338649097687resource.r 1100 None
- 23/10/02 06:48:00 INFO RDriverLocal: 8. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: setting up BufferedStreamThread with bufferSize: 1000.
- 23/10/02 06:48:02 INFO RDriverLocal: 9. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: R process started with RServe listening on port 1100.
- 23/10/02 06:48:02 INFO RDriverLocal: 10. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: starting interpreter to talk to R process ...
- 23/10/02 06:48:03 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
- 23/10/02 06:48:03 INFO ROutputStreamHandler: Successfully connected to stdout in the RShell.
- 23/10/02 06:48:03 INFO ROutputStreamHandler: Successfully connected to stderr in the RShell.
- 23/10/02 06:48:03 INFO RDriverLocal: 11. RDriverLocal.9cdadbb3-d299-42ba-9d5b-05d6458230cb: R interpreter is connected.
- 23/10/02 06:48:03 INFO RDriverWrapper: setupRepl:ReplId-62225-c3991-b74c2-9: finished loading
- 23/10/02 06:48:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:48:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:48:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:48:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:48:25 INFO EventLoggingStats: UsageLog|1696229220000|1696229250000|2104|3|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
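EventLoggingStats emits one pipe-delimited UsageLog record per 30-second window. The schema is not documented in the log itself, so the field names in this parsing sketch are inferences from the values: the two 13-digit numbers are the window's start and end in epoch milliseconds (exactly 30 s apart and consistent with the surrounding timestamps), followed by what appear to be a byte count and an event count, then the emitter role and a build identifier:

    import java.time.Instant

    object UsageLogParse {
      // Field names are assumptions inferred from the values, not a documented schema.
      final case class UsageLogStats(windowStart: Instant, windowEnd: Instant,
                                     bytes: Long, events: Long,
                                     role: String, build: String)

      def parse(line: String): UsageLogStats = {
        val Array(_, start, end, bytes, events, role, build) = line.split('|')
        UsageLogStats(Instant.ofEpochMilli(start.toLong), Instant.ofEpochMilli(end.toLong),
                      bytes.toLong, events.toLong, role, build)
      }

      def main(args: Array[String]): Unit =
        println(parse("UsageLog|1696229220000|1696229250000|2104|3|driver|" +
          "instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863"))
    }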
- 23/10/02 06:48:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:48:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:48:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:48:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:48:34 WARN DriverDaemon: ShouldUseAutoscalingInfo exception thrown, not logging stack trace. This is used for control flow and is ok to ignore
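The wording of this warning is deliberate: ShouldUseAutoscalingInfo is an exception thrown for control flow, so the daemon suppresses the stack trace. The class itself is Databricks-internal, but the idiom is standard Scala; a generic sketch using NoStackTrace, which keeps such signals cheap to throw and quiet in logs:

    import scala.util.Random
    import scala.util.control.NoStackTrace

    // Control-flow signal: mixing in NoStackTrace skips stack capture.
    final case class ShouldUseFallback() extends Exception with NoStackTrace

    object ControlFlowSignalDemo {
      def compute(): Int =
        if (Random.nextDouble() < 0.5) throw ShouldUseFallback() else 42

      def main(args: Array[String]): Unit = {
        val result = try compute() catch {
          case ShouldUseFallback() => 0 // expected path, nothing to log
        }
        println(result)
      }
    }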
- 23/10/02 06:48:43 ERROR CommandLineHelper$: Command [REDACTED] failed with exit code 1 out: err:
- java.lang.RuntimeException: CommandLineHelper exception - stack trace
- at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:523)
- at com.databricks.backend.common.util.CommandLineHelper$.runJavaProcessBuilder(CommandLineHelper.scala:418)
- at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$new$6(DriverCorral.scala:389)
- at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
- at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)
- at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:666)
- at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:684)
- at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
- at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
- at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
- at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
- at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
- at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
- at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
- at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
- at com.databricks.threading.NamedTimer$$anon$1.withAttributionTags(NamedTimer.scala:95)
- at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:661)
- at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:580)
- at com.databricks.threading.NamedTimer$$anon$1.recordOperationWithResultTags(NamedTimer.scala:95)
- at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:571)
- at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:540)
- at com.databricks.threading.NamedTimer$$anon$1.recordOperation(NamedTimer.scala:95)
- at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$2(NamedTimer.scala:104)
- at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
- at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:420)
- at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
- at com.databricks.threading.NamedTimer$$anon$1.withAttributionContext(NamedTimer.scala:95)
- at com.databricks.logging.UsageLogging.disableTracing(UsageLogging.scala:1420)
- at com.databricks.logging.UsageLogging.disableTracing$(UsageLogging.scala:1419)
- at com.databricks.threading.NamedTimer$$anon$1.disableTracing(NamedTimer.scala:95)
- at com.databricks.threading.NamedTimer$$anon$1.$anonfun$run$1(NamedTimer.scala:103)
- at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
- at com.databricks.util.UntrustedUtils$.tryLog(UntrustedUtils.scala:109)
- at com.databricks.threading.NamedTimer$$anon$1.run(NamedTimer.scala:102)
- at java.util.TimerThread.mainLoop(Timer.java:555)
- at java.util.TimerThread.run(Timer.java:505)
- 23/10/02 06:48:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:48:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:48:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:48:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:48:52 WARN DriverDaemon: ShouldUseAutoscalingInfo exception thrown, not logging stack trace. This is used for control flow and is ok to ignore
- 23/10/02 06:48:55 INFO EventLoggingStats: UsageLog|1696229250000|1696229280000|184882|24|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
- 23/10/02 06:49:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:49:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:49:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:49:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:49:25 INFO EventLoggingStats: UsageLog|1696229280000|1696229310000|64928|15|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
- 23/10/02 06:49:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:49:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:49:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:49:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:49:43 ERROR CommandLineHelper$: Command [REDACTED] failed with exit code 1 out: err:
- java.lang.RuntimeException: CommandLineHelper exception - stack trace (identical to the 06:48:43 trace above; 36 frames omitted)
- 23/10/02 06:49:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:49:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:49:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:49:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:49:55 INFO EventLoggingStats: UsageLog|1696229310000|1696229340000|76526|17|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
- 23/10/02 06:50:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:50:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:50:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:50:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:50:25 INFO EventLoggingStats: UsageLog|1696229340000|1696229370000|64928|15|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
- 23/10/02 06:50:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:50:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:50:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:50:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:50:43 ERROR CommandLineHelper$: Command [REDACTED] failed with exit code 1 out: err:
- java.lang.RuntimeException: CommandLineHelper exception - stack trace (identical to the 06:48:43 trace above; 36 frames omitted)
- 23/10/02 06:50:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:50:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:50:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:50:51 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:50:55 INFO EventLoggingStats: UsageLog|1696229370000|1696229400000|64928|15|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
- 23/10/02 06:51:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:51:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:51:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:51:11 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:51:25 INFO EventLoggingStats: UsageLog|1696229400000|1696229430000|64932|15|driver|instance-manager_2023-09-16_00.06.15Z_clusters-branch_2023-09-13_bf188a60_17b6feaf_1030641863
- 23/10/02 06:51:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:51:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:51:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, df /local_disk0)
- 23/10/02 06:51:31 INFO Utils: resolved command to be run: WrappedArray(/bin/sh, -c, cat /proc/meminfo)
- 23/10/02 06:51:43 ERROR CommandLineHelper$: Command [REDACTED] failed with exit code 1 out: err:
- java.lang.RuntimeException: CommandLineHelper exception - stack trace (identical to the 06:48:43 trace above; 36 frames omitted)