- 2023-08-20T12:00:31,469 INFO [main] conf.HiveConf: Found configuration file file:/home/infernus/hive-2.3.9/conf/hive-site.xml
- 2023-08-20T12:00:32,339 WARN [main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
- 2023-08-20T12:00:32,342 WARN [main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
- 2023-08-20T12:00:32,559 WARN [main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
- 2023-08-20T12:00:32,560 WARN [main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
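The repeated WARN pairs above mean hive-site.xml defines properties this Hive build does not recognize: hive.hadoop.configured.resources and hive.metastore.db.type are not HiveConf variables in Hive 2.3.9, so they are silently ignored. A quick way to check whether the running CLI knows a property is to print it with no value (a sketch, using the names from the warnings):

    -- Hive CLI: SET with no value echoes the effective setting;
    -- an unrecognized key comes back as "<key> is undefined"
    SET hive.metastore.db.type;
    SET hive.hadoop.configured.resources;
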
- 2023-08-20T12:00:32,675 INFO [main] SessionState:
- Logging initialized using configuration in file:/home/infernus/hive-2.3.9/conf/hive-log4j2.properties Async: true
- 2023-08-20T12:00:32,836 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- 2023-08-20T12:00:34,855 INFO [main] session.SessionState: Created HDFS directory: /tmp/hive/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a
- 2023-08-20T12:00:34,920 INFO [main] session.SessionState: Created local directory: /tmp/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a
- 2023-08-20T12:00:34,936 INFO [main] session.SessionState: Created HDFS directory: /tmp/hive/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a/_tmp_space.db
- 2023-08-20T12:00:34,967 INFO [main] conf.HiveConf: Using the default value passed in for log id: f0fd81a8-0d73-43d0-814e-bda0253c132a
- 2023-08-20T12:00:34,970 INFO [main] session.SessionState: Updating thread name to f0fd81a8-0d73-43d0-814e-bda0253c132a main
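SessionState has created one HDFS scratch directory and one local scratch directory, both keyed by the session ID f0fd81a8-0d73-43d0-814e-bda0253c132a, which from here on also appears as the thread name in each log line. The roots of these directories come from hive.exec.scratchdir (default /tmp/hive, matching the paths above) and hive.exec.local.scratchdir; a sketch for inspecting the effective values:

    SET hive.exec.scratchdir;
    SET hive.exec.local.scratchdir;
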
- 2023-08-20T12:00:38,948 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: Using the default value passed in for log id: f0fd81a8-0d73-43d0-814e-bda0253c132a
- 2023-08-20T12:00:39,026 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Compiling command(queryId=infernus_20230820120038_4eccc5c8-33bd-496c-b0c2-562457c90a4c): use stock
- 2023-08-20T12:00:39,914 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
- 2023-08-20T12:00:39,915 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
- 2023-08-20T12:00:39,916 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
- 2023-08-20T12:00:39,995 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.ObjectStore: ObjectStore, initialize called
- 2023-08-20T12:00:42,539 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
- 2023-08-20T12:00:42,542 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
- 2023-08-20T12:00:42,546 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
- 2023-08-20T12:00:46,108 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is POSTGRES
- 2023-08-20T12:00:46,115 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.ObjectStore: Initialized ObjectStore
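The "Opening raw store ... ObjectStore" line shows the metastore running embedded in the CLI JVM, backed by Postgres, with MetaStoreDirectSql bypassing the DataNucleus ORM layer for read-heavy calls. Direct SQL is governed by hive.metastore.try.direct.sql; with an embedded metastore it can be inspected or toggled per session when troubleshooting ORM-versus-SQL discrepancies (a sketch):

    SET hive.metastore.try.direct.sql;       -- defaults to true
    SET hive.metastore.try.direct.sql.ddl;   -- same mechanism inside DDL transactions
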
- 2023-08-20T12:00:46,348 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: Added admin role in metastore
- 2023-08-20T12:00:46,359 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: Added public role in metastore
- 2023-08-20T12:00:46,394 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: No user is added in admin role, since config is empty
- 2023-08-20T12:00:46,570 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_all_functions
- 2023-08-20T12:00:46,571 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_all_functions
- 2023-08-20T12:00:46,606 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_database: stock
- 2023-08-20T12:00:46,607 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_database: stock
- 2023-08-20T12:00:46,641 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Semantic Analysis Completed
- 2023-08-20T12:00:46,645 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
- 2023-08-20T12:00:46,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Completed compiling command(queryId=infernus_20230820120038_4eccc5c8-33bd-496c-b0c2-562457c90a4c); Time taken: 7.671 seconds
- 2023-08-20T12:00:46,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
- 2023-08-20T12:00:46,664 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Executing command(queryId=infernus_20230820120038_4eccc5c8-33bd-496c-b0c2-562457c90a4c): use stock
- 2023-08-20T12:00:46,732 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] sqlstd.SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=f0fd81a8-0d73-43d0-814e-bda0253c132a, clientType=HIVECLI]
- 2023-08-20T12:00:46,737 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] session.SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
- 2023-08-20T12:00:46,737 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] hive.metastore: Metastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
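The SQLStdHiveAccessController creation plus the filter-hook swap above indicate hive.security.authorization.manager points at a HiveAuthorizerFactory, so metastore results are filtered through the authorizer instead of the default hook. A configuration producing this behavior typically looks like the following; the concrete factory class in this installation's hive-site.xml is an assumption (for the CLI the ConfOnly variant is the usual choice):

    -- assumed settings, shown as session-level SETs for illustration
    SET hive.security.authorization.enabled=true;
    SET hive.security.authorization.manager=org.apache.hadoop.hive.ql.security.authorization.plugin.sqlstd.SQLStdConfOnlyAuthorizerFactory;
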
- 2023-08-20T12:00:46,740 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
- 2023-08-20T12:00:46,740 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=Cleaning up thread local RawStore...
- 2023-08-20T12:00:46,741 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
- 2023-08-20T12:00:46,741 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=Done cleaning up thread local RawStore
- 2023-08-20T12:00:46,766 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Starting task [Stage-0:DDL] in serial mode
- 2023-08-20T12:00:46,768 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_database: stock
- 2023-08-20T12:00:46,768 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_database: stock
- 2023-08-20T12:00:46,806 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
- 2023-08-20T12:00:46,807 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
- 2023-08-20T12:00:46,808 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
- 2023-08-20T12:00:46,810 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.ObjectStore: ObjectStore, initialize called
- 2023-08-20T12:00:46,831 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is POSTGRES
- 2023-08-20T12:00:46,831 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.ObjectStore: Initialized ObjectStore
- 2023-08-20T12:00:46,837 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_database: stock
- 2023-08-20T12:00:46,838 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_database: stock
- 2023-08-20T12:00:46,846 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Completed executing command(queryId=infernus_20230820120038_4eccc5c8-33bd-496c-b0c2-562457c90a4c); Time taken: 0.182 seconds
- 2023-08-20T12:00:46,848 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: OK
- 2023-08-20T12:00:46,852 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: Using the default value passed in for log id: f0fd81a8-0d73-43d0-814e-bda0253c132a
- 2023-08-20T12:00:46,852 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] CliDriver: Time taken: 7.87 seconds
- 2023-08-20T12:00:46,852 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] session.SessionState: Resetting thread name to main
- 2023-08-20T12:00:48,542 INFO [main] conf.HiveConf: Using the default value passed in for log id: f0fd81a8-0d73-43d0-814e-bda0253c132a
- 2023-08-20T12:00:48,542 INFO [main] session.SessionState: Updating thread name to f0fd81a8-0d73-43d0-814e-bda0253c132a main
- 2023-08-20T12:00:48,544 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Compiling command(queryId=infernus_20230820120048_1fb1cd87-c4ed-4156-948c-4a3e42db0c06): select count(*) from yahoo_table
- 2023-08-20T12:00:48,649 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Starting Semantic Analysis
- 2023-08-20T12:00:48,656 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Completed phase 1 of Semantic Analysis
- 2023-08-20T12:00:48,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for source tables
- 2023-08-20T12:00:48,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_table : db=stock tbl=yahoo_table
- 2023-08-20T12:00:48,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_table : db=stock tbl=yahoo_table
- 2023-08-20T12:00:48,871 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for subqueries
- 2023-08-20T12:00:48,885 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for destination tables
- 2023-08-20T12:00:48,985 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Context: New scratch dir is hdfs://localhost:9000/tmp/hive/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a/hive_2023-08-20_12-00-48_568_8196585506508880469-1
- 2023-08-20T12:00:48,994 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Completed getting MetaData in Semantic Analysis
- 2023-08-20T12:00:53,255 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for source tables
- 2023-08-20T12:00:53,256 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_table : db=stock tbl=yahoo_table
- 2023-08-20T12:00:53,256 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_table : db=stock tbl=yahoo_table
- 2023-08-20T12:00:53,368 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for subqueries
- 2023-08-20T12:00:53,368 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for destination tables
- 2023-08-20T12:00:53,385 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Context: New scratch dir is hdfs://localhost:9000/tmp/hive/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a/hive_2023-08-20_12-00-48_568_8196585506508880469-1
- 2023-08-20T12:00:53,555 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] common.FileUtils: Creating directory if it doesn't exist: hdfs://localhost:9000/tmp/hive/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a/hive_2023-08-20_12-00-48_568_8196585506508880469-1/-mr-10001/.hive-staging_hive_2023-08-20_12-00-48_568_8196585506508880469-1
- 2023-08-20T12:00:54,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: CBO Succeeded; optimized logical plan.
- 2023-08-20T12:00:54,429 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for FS(6)
- 2023-08-20T12:00:54,430 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for SEL(5)
- 2023-08-20T12:00:54,430 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for GBY(4)
- 2023-08-20T12:00:54,431 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for RS(3)
- 2023-08-20T12:00:54,431 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for GBY(2)
- 2023-08-20T12:00:54,431 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for SEL(1)
- 2023-08-20T12:00:54,431 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for TS(0)
- 2023-08-20T12:00:54,477 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] optimizer.ColumnPrunerProcFactory: RS 3 oldColExprMap: {VALUE._col0=Column[_col0]}
- 2023-08-20T12:00:54,480 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] optimizer.ColumnPrunerProcFactory: RS 3 newColExprMap: {VALUE._col0=Column[_col0]}
- 2023-08-20T12:00:54,943 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SetSparkReducerParallelism: Number of reducers for sink RS[3] was already determined to be: 1
- 2023-08-20T12:00:55,057 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Completed plan generation
- 2023-08-20T12:00:55,058 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Semantic Analysis Completed
- 2023-08-20T12:00:55,058 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:_c0, type:bigint, comment:null)], properties:null)
- 2023-08-20T12:00:55,082 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] exec.ListSinkOperator: Initializing operator LIST_SINK[7]
- 2023-08-20T12:00:55,096 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Completed compiling command(queryId=infernus_20230820120048_1fb1cd87-c4ed-4156-948c-4a3e42db0c06); Time taken: 6.551 seconds
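Compilation of the count query took 6.551 seconds and produced the operator pipeline that predicate pushdown and column pruning walked above: TS(0) -> SEL(1) -> GBY(2) -> RS(3) -> GBY(4) -> SEL(5) -> FS(6), ending in LIST_SINK[7], i.e. a table scan feeding a partial aggregate, a single-key shuffle, and a final aggregate. The same plan can be printed on demand, which is usually easier than reading planner logs:

    EXPLAIN SELECT count(*) FROM yahoo_table;
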
- 2023-08-20T12:00:55,096 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
- 2023-08-20T12:00:55,096 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Executing command(queryId=infernus_20230820120048_1fb1cd87-c4ed-4156-948c-4a3e42db0c06): select count(*) from yahoo_table
- 2023-08-20T12:00:55,097 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Query ID = infernus_20230820120048_1fb1cd87-c4ed-4156-948c-4a3e42db0c06
- 2023-08-20T12:00:55,097 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Total jobs = 1
- 2023-08-20T12:00:55,114 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Launching Job 1 out of 1
- 2023-08-20T12:00:55,119 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Starting task [Stage-1:MAPRED] in serial mode
- 2023-08-20T12:00:55,119 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: In order to change the average load for a reducer (in bytes):
- 2023-08-20T12:00:55,120 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: set hive.exec.reducers.bytes.per.reducer=<number>
- 2023-08-20T12:00:55,120 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: In order to limit the maximum number of reducers:
- 2023-08-20T12:00:55,120 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: set hive.exec.reducers.max=<number>
- 2023-08-20T12:00:55,121 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: In order to set a constant number of reducers:
- 2023-08-20T12:00:55,121 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: set mapreduce.job.reduces=<number>
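These six SparkTask lines are Hive's standard reducer-tuning hints, printed for every job; they are informational, not a warning. For this query the parallelism question is already settled: SetSparkReducerParallelism pinned RS[3] to a single reducer above, which is inherent to a global count(*) since one task must produce the final total. The hints map to ordinary session settings; the values below are illustrative only:

    SET hive.exec.reducers.bytes.per.reducer=268435456;  -- target average input per reducer
    SET hive.exec.reducers.max=64;                       -- cap on the reducer count
    SET mapreduce.job.reduces=2;                         -- pin an exact reducer count
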
- 2023-08-20T12:00:55,154 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] session.SparkSessionManagerImpl: Setting up the session manager.
- 2023-08-20T12:00:55,266 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
- 2023-08-20T12:00:55,266 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
- 2023-08-20T12:00:55,268 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.hfilecleaner.plugins -> org.apache.hadoop.hbase.master.cleaner.TimeToLiveHFileCleaner).
- 2023-08-20T12:00:55,269 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.regionSplitLimit -> 1000).
- 2023-08-20T12:00:55,269 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.timeout.period -> 60000).
- 2023-08-20T12:00:55,269 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rootdir -> /tmp/hbase-infernus/hbase).
- 2023-08-20T12:00:55,269 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.majorcompaction -> 604800000).
- 2023-08-20T12:00:55,269 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.logroll.period -> 3600000).
- 2023-08-20T12:00:55,270 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compactionThreshold -> 3).
- 2023-08-20T12:00:55,270 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.bytes.per.checksum -> 16384).
- 2023-08-20T12:00:55,270 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.client.fallback-to-simple-auth-allowed -> false).
- 2023-08-20T12:00:55,270 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.perregion.tasks -> 1).
- 2023-08-20T12:00:55,270 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.max.filesize -> 10737418240).
- 2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.region.split.policy -> org.apache.hadoop.hbase.regionserver.IncreasingToUpperBoundRegionSplitPolicy).
- 2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.thread.wakefrequency -> 10000).
- 2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.majorcompaction.jitter -> 0.50).
- 2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.abortonerror -> true).
- 2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.logcleaner.ttl -> 600000).
- 2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load spark property from hive configuration (spark.master -> spark://localhost:4040).
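One value worth flagging in this stream: spark.master resolves to spark://localhost:4040. Port 4040 is conventionally the Spark application web UI, while a standalone master normally listens on 7077, so if the SparkTask later fails to obtain a Spark session this setting is a likely culprit; the correct port for this particular cluster is an assumption:

    -- assumed correction if the standalone master uses its default port
    SET spark.master=spark://localhost:7077;
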
- 2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.parent -> /hbase).
- 2023-08-20T12:00:55,272 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.scan.ratio -> 0).
- 2023-08-20T12:00:55,272 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.rpc.threads -> 8).
- 2023-08-20T12:00:55,272 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.framed.max_frame_size_in_mb -> 2).
- 2023-08-20T12:00:55,272 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.filter.classes -> org.apache.hadoop.hbase.rest.filter.GzipFilter).
- 2023-08-20T12:00:55,272 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.port -> 16000).
- 2023-08-20T12:00:55,273 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.connect.timeout -> 9000000).
- 2023-08-20T12:00:55,273 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.dynamic.jars.dir -> /tmp/hbase-infernus/hbase/lib).
- 2023-08-20T12:00:55,273 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.catalog.timeout -> 600000).
- 2023-08-20T12:00:55,273 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.blockingStoreFiles -> 10).
- 2023-08-20T12:00:55,274 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.checksum.algorithm -> CRC32).
- 2023-08-20T12:00:55,275 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.optionalcacheflushinterval -> 3600000).
- 2023-08-20T12:00:55,275 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rpc.timeout -> 60000).
- 2023-08-20T12:00:55,275 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.clientPort -> 2181).
- 2023-08-20T12:00:55,275 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.syncLimit -> 5).
- 2023-08-20T12:00:55,275 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.catalog.timeout -> 600000).
- 2023-08-20T12:00:55,276 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.write.buffer -> 2097152).
- 2023-08-20T12:00:55,276 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.restore.take.failsafe.snapshot -> true).
- 2023-08-20T12:00:55,276 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.bulkload.staging.dir -> /user/infernus/hbase-staging).
- 2023-08-20T12:00:55,276 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.rootserver -> root-region-server).
- 2023-08-20T12:00:55,276 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.port -> 8080).
- 2023-08-20T12:00:55,277 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rootdir.perms -> 700).
- 2023-08-20T12:00:55,277 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.localityCheck.threadPoolSize -> 2).
- 2023-08-20T12:00:55,277 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.auth.token.max.lifetime -> 604800000).
- 2023-08-20T12:00:55,277 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.dns.nameserver -> default).
- 2023-08-20T12:00:55,277 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.enabled -> true).
- 2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.cells.scanned.per.heartbeat.check -> 10000).
- 2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load spark property from hive configuration (spark.submit.deployMode -> cluster).
- 2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regions.slop -> 0.2).
- 2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.useMulti -> true).
- 2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.staticuser.user -> dr.stack).
- 2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.table.max.rowsize -> 1073741824).
- 2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.blockingWaitTime -> 90000).
- 2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.quorum -> localhost).
- 2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rpc.shortoperation.timeout -> 10000).
- 2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.framed -> false).
- 2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.tmp.dir -> /tmp/hbase-infernus).
- 2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.initLimit -> 10).
- 2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.compact -> false).
- 2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.info.bindAddress -> 0.0.0.0).
- 2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.lease.recovery.timeout -> 900000).
- 2023-08-20T12:00:55,281 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.maxClientCnxns -> 300).
- 2023-08-20T12:00:55,281 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.online.schema.update.enable -> true).
- 2023-08-20T12:00:55,281 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.exec.permission.checks -> false).
- 2023-08-20T12:00:55,281 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.peerport -> 2888).
- 2023-08-20T12:00:55,282 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.dfs.client.read.shortcircuit.buffer.size -> 131072).
- 2023-08-20T12:00:55,282 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.region.replica.replication.enabled -> false).
- 2023-08-20T12:00:55,298 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.readonly -> false).
- 2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.config.read.zookeeper.config -> false).
- 2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.restore.failsafe.name -> hbase-failsafe-{snapshot.name}-{restore.timestamp}).
- 2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.max.result.size -> 2097152).
- 2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.metrics.showTableName -> true).
- 2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.filter.initializers -> org.apache.hadoop.hbase.http.lib.StaticUserWebFilter).
- 2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.perserver.tasks -> 5).
- 2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.threads.min -> 2).
- 2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.keyvalue.maxsize -> 10485760).
- 2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.storescanner.parallel.seek.enable -> false).
- 2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.secret.bits -> 256).
- 2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.column.max.version -> 1).
- 2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.loadbalancer.class -> org.apache.hadoop.hbase.master.balancer.StochasticLoadBalancer).
- 2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.dns.interface -> default).
- 2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.handler.count -> 30).
- 2023-08-20T12:00:55,301 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.defaults.for.version.skip -> false).
- 2023-08-20T12:00:55,301 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.compactchecker.interval.multiplier -> 1000).
- 2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.distributed.log.replay -> false).
- 2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.port -> 16030).
- 2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.storefile.refresh.period -> 0).
- 2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.threads.max -> 100).
- 2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.balancer.period -> 300000).
- 2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.bindAddress -> 0.0.0.0).
- 2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.logcleaner.plugins -> org.apache.hadoop.hbase.master.cleaner.TimeToLiveLogCleaner).
- 2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.authentication -> simple).
- 2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.auth.key.update.interval -> 86400000).
- 2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.support.proxyuser -> false).
- 2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.cluster.distributed -> false).
- 2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.fs.tmp.dir -> /user/infernus/hbase-staging).
- 2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.scanner.max.result.size -> 104857600).
- 2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.maxQueuedRequests -> 1000).
- 2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.total.tasks -> 100).
- 2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.port.auto -> false).
- 2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.server.connect.timeout -> 90000).
- 2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.mslab.enabled -> true).
- 2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.port -> 16020).
- 2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.dns.interface -> default).
- 2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.lease.recovery.dfs.timeout -> 64000).
- 2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.table.lock.enable -> true).
- 2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.percolumnfamilyflush.size.lower.bound -> 16777216).
- 2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.read.ratio -> 0).
- 2023-08-20T12:00:55,307 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.logroll.errors.tolerated -> 2).
- 2023-08-20T12:00:55,308 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.max.threads -> 10).
- 2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.multicast.address.ip -> 226.1.1.3).
- 2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.dns.nameserver -> default).
- 2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.multicast.address.port -> 16100).
- 2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.publisher.class -> org.apache.hadoop.hbase.master.ClusterStatusPublisher$MulticastPublisher).
- 2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.published -> false).
- 2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.retries.number -> 35).
- 2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.listener.class -> org.apache.hadoop.hbase.client.ClusterStatusListener$MulticastListener).
- 2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.caching -> 2147483647).
- 2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.dataDir -> /tmp/hbase-infernus/zookeeper).
- 2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.local.dir -> /tmp/hbase-infernus/local/).
- 2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.visibility.mutations.checkauths -> false).
- 2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.msginterval -> 3000).
- 2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.acl.parent -> acl).
- 2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.pause -> 100).
- 2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rs.cacheblocksonwrite -> false).
- 2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.enabled -> true).
- 2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.data.umask.enable -> false).
- 2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.checksum.verify -> true).
- 2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.handler.factor -> 0.1).
- 2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.block.multiplier -> 4).
- 2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.time.to.purge.deletes -> 0).
- 2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.htablepool.size.max -> 1000).
- 2023-08-20T12:00:55,312 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.replication.rpc.codec -> org.apache.hadoop.hbase.codec.KeyValueCodecWithTags).
- 2023-08-20T12:00:55,315 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compaction.kv.max -> 10).
- 2023-08-20T12:00:55,316 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compaction.max -> 10).
- 2023-08-20T12:00:55,316 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.maxWorkerThreads -> 1000).
- 2023-08-20T12:00:55,316 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.data.umask -> 000).
- 2023-08-20T12:00:55,316 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.user.enabled -> true).
- 2023-08-20T12:00:55,316 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.flusher.count -> 2).
- 2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.metrics.exposeOperationTimes -> true).
- 2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.client.tcpnodelay -> true).
- 2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.rpc.max.size -> 52428800).
- 2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.flush.size -> 134217728).
- 2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.bulkload.retries.number -> 10).
- 2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.hlog.reader.impl -> org.apache.hadoop.hbase.regionserver.wal.ProtobufLogReader).
- 2023-08-20T12:00:55,318 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.handler.abort.on.error.percent -> 0.5).
- 2023-08-20T12:00:55,318 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coordinated.state.manager.class -> org.apache.hadoop.hbase.coordination.ZkCoordinatedStateManager).
- 2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.versionfile.writeattempts -> 3).
- 2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.defaults.for.version -> 1.1.1).
- 2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.hlog.writer.impl -> org.apache.hadoop.hbase.regionserver.wal.ProtobufLogWriter).
- 2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.leaderport -> 3888).
- 2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.minWorkerThreads -> 16).
- 2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.storescanner.parallel.seek.threads -> 10).
- 2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.preclose.flush.size -> 5242880).
- 2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.info.port -> 16010).
- 2023-08-20T12:00:55,320 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.infoserver.redirect -> true).
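Interleaved with the HBase properties, HiveSparkClientFactory has now collected the Hive-on-Spark client settings: spark.master, spark.submit.deployMode=cluster, and the RPC channel parameters (hive.spark.client.connect.timeout=9000000, hive.spark.client.server.connect.timeout=90000, hive.spark.client.rpc.threads=8, hive.spark.client.rpc.max.size=52428800, hive.spark.client.secret.bits=256). The 9,000,000 ms connect timeout is far above Hive's usual 1000 ms default, presumably raised by hand. These are all plain session settings, e.g.:

    -- values copied from the log above
    SET spark.submit.deployMode=cluster;
    SET hive.spark.client.server.connect.timeout=90000;  -- ms to wait for the remote driver to call back
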
- 2023-08-20T12:00:55,649 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.hfilecleaner.plugins -> org.apache.hadoop.hbase.master.cleaner.TimeToLiveHFileCleaner).
- 2023-08-20T12:00:55,650 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.regionSplitLimit -> 1000).
- 2023-08-20T12:00:55,650 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.timeout.period -> 60000).
- 2023-08-20T12:00:55,650 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rootdir -> /tmp/hbase-infernus/hbase).
- 2023-08-20T12:00:55,650 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.majorcompaction -> 604800000).
- 2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.logroll.period -> 3600000).
- 2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compactionThreshold -> 3).
- 2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.bytes.per.checksum -> 16384).
- 2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.client.fallback-to-simple-auth-allowed -> false).
- 2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.perregion.tasks -> 1).
- 2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.max.filesize -> 10737418240).
- 2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.region.split.policy -> org.apache.hadoop.hbase.regionserver.IncreasingToUpperBoundRegionSplitPolicy).
- 2023-08-20T12:00:55,655 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.thread.wakefrequency -> 10000).
- 2023-08-20T12:00:55,655 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.majorcompaction.jitter -> 0.50).
- 2023-08-20T12:00:55,655 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.abortonerror -> true).
- 2023-08-20T12:00:55,655 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.logcleaner.ttl -> 600000).
- 2023-08-20T12:00:55,655 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load spark property from hive configuration (spark.master -> spark://localhost:4040).
- 2023-08-20T12:00:55,656 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.parent -> /hbase).
- 2023-08-20T12:00:55,656 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.scan.ratio -> 0).
- 2023-08-20T12:00:55,656 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.rpc.threads -> 8).
- 2023-08-20T12:00:55,656 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.framed.max_frame_size_in_mb -> 2).
- 2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.filter.classes -> org.apache.hadoop.hbase.rest.filter.GzipFilter).
- 2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.port -> 16000).
- 2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.connect.timeout -> 9000000).
- 2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.dynamic.jars.dir -> /tmp/hbase-infernus/hbase/lib).
- 2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.catalog.timeout -> 600000).
- 2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.blockingStoreFiles -> 10).
- 2023-08-20T12:00:55,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.checksum.algorithm -> CRC32).
- 2023-08-20T12:00:55,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.optionalcacheflushinterval -> 3600000).
- 2023-08-20T12:00:55,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rpc.timeout -> 60000).
- 2023-08-20T12:00:55,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.clientPort -> 2181).
- 2023-08-20T12:00:55,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.syncLimit -> 5).
- 2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.catalog.timeout -> 600000).
- 2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.write.buffer -> 2097152).
- 2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.restore.take.failsafe.snapshot -> true).
- 2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.bulkload.staging.dir -> /user/infernus/hbase-staging).
- 2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.rootserver -> root-region-server).
- 2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.port -> 8080).
- 2023-08-20T12:00:55,660 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rootdir.perms -> 700).
- 2023-08-20T12:00:55,660 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.localityCheck.threadPoolSize -> 2).
- 2023-08-20T12:00:55,660 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.auth.token.max.lifetime -> 604800000).
- 2023-08-20T12:00:55,660 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.dns.nameserver -> default).
- 2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.enabled -> true).
- 2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.cells.scanned.per.heartbeat.check -> 10000).
- 2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load spark property from hive configuration (spark.submit.deployMode -> cluster).
- 2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regions.slop -> 0.2).
- 2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.useMulti -> true).
- 2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.staticuser.user -> dr.stack).
- 2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.table.max.rowsize -> 1073741824).
- 2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.blockingWaitTime -> 90000).
- 2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.quorum -> localhost).
- 2023-08-20T12:00:55,662 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rpc.shortoperation.timeout -> 10000).
- 2023-08-20T12:00:55,662 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.framed -> false).
- 2023-08-20T12:00:55,662 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.tmp.dir -> /tmp/hbase-infernus).
- 2023-08-20T12:00:55,662 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.initLimit -> 10).
- 2023-08-20T12:00:55,662 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.compact -> false).
- 2023-08-20T12:00:55,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.info.bindAddress -> 0.0.0.0).
- 2023-08-20T12:00:55,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.lease.recovery.timeout -> 900000).
- 2023-08-20T12:00:55,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.maxClientCnxns -> 300).
- 2023-08-20T12:00:55,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.online.schema.update.enable -> true).
- 2023-08-20T12:00:55,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.exec.permission.checks -> false).
- 2023-08-20T12:00:55,664 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.peerport -> 2888).
- 2023-08-20T12:00:55,664 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.dfs.client.read.shortcircuit.buffer.size -> 131072).
- 2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.region.replica.replication.enabled -> false).
- 2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.readonly -> false).
- 2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.config.read.zookeeper.config -> false).
- 2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.restore.failsafe.name -> hbase-failsafe-{snapshot.name}-{restore.timestamp}).
- 2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.max.result.size -> 2097152).
- 2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.metrics.showTableName -> true).
- 2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.filter.initializers -> org.apache.hadoop.hbase.http.lib.StaticUserWebFilter).
- 2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.perserver.tasks -> 5).
- 2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.threads.min -> 2).
- 2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.keyvalue.maxsize -> 10485760).
- 2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.storescanner.parallel.seek.enable -> false).
- 2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.secret.bits -> 256).
- 2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.column.max.version -> 1).
- 2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.loadbalancer.class -> org.apache.hadoop.hbase.master.balancer.StochasticLoadBalancer).
- 2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.dns.interface -> default).
- 2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.handler.count -> 30).
- 2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.defaults.for.version.skip -> false).
- 2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.compactchecker.interval.multiplier -> 1000).
- 2023-08-20T12:00:55,667 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.distributed.log.replay -> false).
- 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.port -> 16030).
- 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.storefile.refresh.period -> 0).
- 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.threads.max -> 100).
- 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.balancer.period -> 300000).
- 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.bindAddress -> 0.0.0.0).
- 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.logcleaner.plugins -> org.apache.hadoop.hbase.master.cleaner.TimeToLiveLogCleaner).
- 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.authentication -> simple).
- 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.auth.key.update.interval -> 86400000).
- 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.support.proxyuser -> false).
- 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.cluster.distributed -> false).
- 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.fs.tmp.dir -> /user/infernus/hbase-staging).
- 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.scanner.max.result.size -> 104857600).
- 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.maxQueuedRequests -> 1000).
- 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.total.tasks -> 100).
- 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.port.auto -> false).
- 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.server.connect.timeout -> 90000).
- 2023-08-20T12:00:55,672 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.mslab.enabled -> true).
- 2023-08-20T12:00:55,672 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.port -> 16020).
- 2023-08-20T12:00:55,672 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.dns.interface -> default).
- 2023-08-20T12:00:55,673 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.lease.recovery.dfs.timeout -> 64000).
- 2023-08-20T12:00:55,673 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.table.lock.enable -> true).
- 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.percolumnfamilyflush.size.lower.bound -> 16777216).
- 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.read.ratio -> 0).
- 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.logroll.errors.tolerated -> 2).
- 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.max.threads -> 10).
- 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.multicast.address.ip -> 226.1.1.3).
- 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.dns.nameserver -> default).
- 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.multicast.address.port -> 16100).
- 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.publisher.class -> org.apache.hadoop.hbase.master.ClusterStatusPublisher$MulticastPublisher).
- 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.published -> false).
- 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.retries.number -> 35).
- 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.listener.class -> org.apache.hadoop.hbase.client.ClusterStatusListener$MulticastListener).
- 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.caching -> 2147483647).
- 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.dataDir -> /tmp/hbase-infernus/zookeeper).
- 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.local.dir -> /tmp/hbase-infernus/local/).
- 2023-08-20T12:00:55,677 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.visibility.mutations.checkauths -> false).
- 2023-08-20T12:00:55,677 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.msginterval -> 3000).
- 2023-08-20T12:00:55,677 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.acl.parent -> acl).
- 2023-08-20T12:00:55,677 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.pause -> 100).
- 2023-08-20T12:00:55,678 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rs.cacheblocksonwrite -> false).
- 2023-08-20T12:00:55,678 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.enabled -> true).
- 2023-08-20T12:00:55,678 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.data.umask.enable -> false).
- 2023-08-20T12:00:55,678 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.checksum.verify -> true).
- 2023-08-20T12:00:55,679 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.handler.factor -> 0.1).
- 2023-08-20T12:00:55,679 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.block.multiplier -> 4).
- 2023-08-20T12:00:55,679 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.time.to.purge.deletes -> 0).
- 2023-08-20T12:00:55,679 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.htablepool.size.max -> 1000).
- 2023-08-20T12:00:55,679 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.replication.rpc.codec -> org.apache.hadoop.hbase.codec.KeyValueCodecWithTags).
- 2023-08-20T12:00:55,680 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compaction.kv.max -> 10).
- 2023-08-20T12:00:55,680 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compaction.max -> 10).
- 2023-08-20T12:00:55,680 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.maxWorkerThreads -> 1000).
- 2023-08-20T12:00:55,680 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.data.umask -> 000).
- 2023-08-20T12:00:55,680 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.user.enabled -> true).
- 2023-08-20T12:00:55,682 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.flusher.count -> 2).
- 2023-08-20T12:00:55,683 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.metrics.exposeOperationTimes -> true).
- 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.client.tcpnodelay -> true).
- 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.rpc.max.size -> 52428800).
- 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.flush.size -> 134217728).
- 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.bulkload.retries.number -> 10).
- 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.hlog.reader.impl -> org.apache.hadoop.hbase.regionserver.wal.ProtobufLogReader).
- 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.handler.abort.on.error.percent -> 0.5).
- 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coordinated.state.manager.class -> org.apache.hadoop.hbase.coordination.ZkCoordinatedStateManager).
- 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.versionfile.writeattempts -> 3).
- 2023-08-20T12:00:55,685 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.defaults.for.version -> 1.1.1).
- 2023-08-20T12:00:55,685 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.hlog.writer.impl -> org.apache.hadoop.hbase.regionserver.wal.ProtobufLogWriter).
- 2023-08-20T12:00:55,685 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.leaderport -> 3888).
- 2023-08-20T12:00:55,687 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.minWorkerThreads -> 16).
- 2023-08-20T12:00:55,687 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.storescanner.parallel.seek.threads -> 10).
- 2023-08-20T12:00:55,687 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.preclose.flush.size -> 5242880).
- 2023-08-20T12:00:55,687 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.info.port -> 16010).
- 2023-08-20T12:00:55,687 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.infoserver.redirect -> true).
- 2023-08-20T12:00:56,164 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] client.SparkClientImpl: Running client driver with argv: /home/infernus/spark-3.3.2-bin-hadoop3/bin/spark-submit --properties-file /tmp/spark-submit.7742744567511501757.properties --class org.apache.hive.spark.client.RemoteDriver /home/infernus/hive-2.3.9/lib/hive-exec-2.3.9.jar --remote-host infernuspc --remote-port 39847 --conf hive.spark.client.connect.timeout=9000000 --conf hive.spark.client.server.connect.timeout=90000 --conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 --conf hive.spark.client.rpc.server.address=null
- 2023-08-20T12:01:00,981 INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
- 2023-08-20T12:01:00,984 INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
- 2023-08-20T12:01:00,984 INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
- 2023-08-20T12:01:00,984 INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
- 2023-08-20T12:01:00,984 INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
- 2023-08-20T12:01:01,316 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:01 WARN Utils: Your hostname, infernuspc resolves to a loopback address: 127.0.1.1; using 10.0.3.15 instead (on interface enp0s8)
- 2023-08-20T12:01:01,323 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:01 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
- 2023-08-20T12:01:02,982 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
- 2023-08-20T12:01:03,132 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:03 INFO SecurityManager: Changing view acls to: infernus
- 2023-08-20T12:01:03,134 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:03 INFO SecurityManager: Changing modify acls to: infernus
- 2023-08-20T12:01:03,135 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:03 INFO SecurityManager: Changing view acls groups to:
- 2023-08-20T12:01:03,139 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:03 INFO SecurityManager: Changing modify acls groups to:
- 2023-08-20T12:01:03,141 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(infernus); groups with view permissions: Set(); users with modify permissions: Set(infernus); groups with modify permissions: Set()
- 2023-08-20T12:01:04,135 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 INFO Utils: Successfully started service 'driverClient' on port 42723.
- 2023-08-20T12:01:04,312 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 INFO TransportClientFactory: Successfully created connection to localhost/127.0.0.1:4040 after 91 ms (0 ms spent in bootstraps)
- 2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 WARN TransportChannelHandler: Exception in connection from localhost/127.0.0.1:4040
- 2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
- 2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
- 2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
- 2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
- 2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- 2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- 2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
- 2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
- 2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
- 2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at java.lang.Thread.run(Thread.java:750)
- 2023-08-20T12:01:04,381 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from localhost/127.0.0.1:4040 is closed
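The bogus frame length above is itself a clue: Spark's TransportFrameDecoder reads the first 8 bytes off the wire as a big-endian length prefix, so decoding the number back into bytes shows what the peer on 127.0.0.1:4040 actually sent. A minimal check in Python:

    # Re-interpret the rejected "frame length" the way TransportFrameDecoder
    # read it: as 8 raw big-endian bytes taken straight off the socket.
    frame = 5211883372140375593
    print(frame.to_bytes(8, "big"))  # -> b'HTTP/1.)'

The bytes spell out the start of an HTTP status line, i.e. whatever is listening on port 4040 answered with HTTP. 4040 is the default Spark application UI port, not an RPC endpoint, which suggests the configured master URL (spark.master) points at an HTTP service instead of the standalone master's RPC port (7077 by default).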
- 2023-08-20T12:01:04,388 INFO [stderr-redir-1] client.SparkClientImpl: Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
- 2023-08-20T12:01:04,388 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
- 2023-08-20T12:01:04,388 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
- 2023-08-20T12:01:04,388 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
- 2023-08-20T12:01:04,388 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.TraversableLike.map(TraversableLike.scala:286)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
- 2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
- 2023-08-20T12:01:04,391 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
- 2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
- 2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at java.lang.Thread.run(Thread.java:750)
- 2023-08-20T12:01:04,418 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 INFO ShutdownHookManager: Shutdown hook called
- 2023-08-20T12:01:04,420 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 INFO ShutdownHookManager: Deleting directory /tmp/spark-15d0e454-792d-44a3-879c-7189721907ad
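Before retrying, it is worth confirming what is actually listening on the configured master host and port. A small probe (a sketch; answers_http is an illustrative name, not part of Hive or Spark) distinguishes an HTTP endpoint such as the Spark UI from Spark's binary RPC endpoints:

    import socket

    def answers_http(host, port, timeout=3.0):
        """Return True if the service replies to a minimal HTTP request.
        The Spark application UI (default 4040) will; a standalone master's
        RPC port (default 7077) speaks Spark's binary protocol and will not."""
        try:
            with socket.create_connection((host, port), timeout=timeout) as s:
                s.sendall(b"HEAD / HTTP/1.0\r\n\r\n")
                return s.recv(8).startswith(b"HTTP/")
        except OSError:
            return False

    print(answers_http("localhost", 4040))  # True here matches the failure above

Separately, Hive 2.3.9's Hive-on-Spark client was developed against Spark 2.x, so even with the master URL corrected, pairing it with Spark 3.3.2 is outside Hive 2.3's supported version matrix and the RemoteDriver handshake may still fail until a matching Spark build is used.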
- 2023-08-20T12:01:04,861 ERROR [f0fd81a8-0d73-43d0-814e-bda0253c132a main] client.SparkClientImpl: Error while waiting for client to connect.
- java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
- Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
- Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
- Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
- Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
- Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
- at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
- at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
- at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
- at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
- at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
- at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
- at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
- at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
- at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
- at scala.collection.TraversableLike.map(TraversableLike.scala:286)
- at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
- at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
- at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
- at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
- at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
- at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
- at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
- at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
- at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
- at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
- at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
- at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
- at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
- at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
- at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
- at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
- at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
- at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
- at java.lang.Thread.run(Thread.java:750)
- at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41) ~[netty-all-4.0.52.Final.jar:4.0.52.Final]
- at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:109) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:97) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:126) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:103) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) ~[hive-cli-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) ~[hive-cli-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) ~[hive-cli-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) ~[hive-cli-2.3.9.jar:2.3.9]
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_382]
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_382]
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_382]
- at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_382]
- at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.3.1.jar:?]
- at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.3.1.jar:?]
- Caused by: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
- Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
- Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
- Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
- Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
- Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
- at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
- at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
- at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
- at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
- at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
- at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
- at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
- at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
- at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
- at scala.collection.TraversableLike.map(TraversableLike.scala:286)
- at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
- at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
- at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
- at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
- at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
- at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
- at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
- at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
- at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
- at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
- at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
- at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
- at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
- at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
- at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
- at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
- at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
- at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
- at java.lang.Thread.run(Thread.java:750)
- at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:212) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:503) ~[hive-exec-2.3.9.jar:2.3.9]
- at java.lang.Thread.run(Thread.java:750) [?:1.8.0_382]
- 2023-08-20T12:01:04,861 WARN [Driver] client.SparkClientImpl: Child process exited with code 1
- 2023-08-20T12:01:04,896 ERROR [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
- org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
- at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:64) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:126) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:103) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) ~[hive-cli-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) ~[hive-cli-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) ~[hive-cli-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) ~[hive-cli-2.3.9.jar:2.3.9]
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_382]
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_382]
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_382]
- at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_382]
- at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.3.1.jar:?]
- at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.3.1.jar:?]
- Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
- Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
- Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
- Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
- Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
- Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
- at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
- at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
- at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
- at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
- at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
- at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
- at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
- at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
- at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
- at scala.collection.TraversableLike.map(TraversableLike.scala:286)
- at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
- at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
- at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
- at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
- at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
- at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
- at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
- at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
- at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
- at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
- at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
- at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
- at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
- at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
- at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
- at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
- at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
- at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
- at java.lang.Thread.run(Thread.java:750)
- at org.apache.hive.com.google.common.base.Throwables.propagate(Throwables.java:160) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:125) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:97) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.3.9.jar:2.3.9]
- ... 22 more
- Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
- Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
- Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
- Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
- Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
- Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
- at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
- at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
- at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
- at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
- at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
- at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
- at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
- at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
- at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
- at scala.collection.TraversableLike.map(TraversableLike.scala:286)
- at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
- at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
- at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
- at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
- at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
- at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
- at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
- at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
- at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
- at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
- at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
- at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
- at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
- at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
- at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
- at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
- at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
- at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
- at java.lang.Thread.run(Thread.java:750)
- at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41) ~[netty-all-4.0.52.Final.jar:4.0.52.Final]
- at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:109) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:97) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.3.9.jar:2.3.9]
- ... 22 more
- Caused by: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
- Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
- Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
- Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
- Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
- Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
- at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
- at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
- at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
- at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
- at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
- at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
- at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
- at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
- at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
- at scala.collection.TraversableLike.map(TraversableLike.scala:286)
- at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
- at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
- at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
- at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
- at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
- at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
- at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
- at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
- at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
- at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
- Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
- at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
- at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
- at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
- at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
- at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
- at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
- at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
- at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
- at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
- at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
- at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
- at java.lang.Thread.run(Thread.java:750)
- at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:212) ~[hive-exec-2.3.9.jar:2.3.9]
- at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:503) ~[hive-exec-2.3.9.jar:2.3.9]
- at java.lang.Thread.run(Thread.java:750) [?:1.8.0_382]
- 2023-08-20T12:01:04,897 ERROR [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create spark client.
- 2023-08-20T12:01:04,897 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Completed executing command(queryId=infernus_20230820120048_1fb1cd87-c4ed-4156-948c-4a3e42db0c06); Time taken: 9.801 seconds
- 2023-08-20T12:01:04,899 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] exec.ListSinkOperator: Closing operator LIST_SINK[7]
- 2023-08-20T12:01:04,984 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: Using the default value passed in for log id: f0fd81a8-0d73-43d0-814e-bda0253c132a
- 2023-08-20T12:01:04,984 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] session.SessionState: Resetting thread name to main
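Note on the root cause: every "Caused by" level above bottoms out in the same exception, java.lang.IllegalArgumentException: Too large frame: 5211883372140375593, thrown while the spark-submit child process launched by Hive's SparkClientImpl tried to register with the Spark standalone master via RpcEnv.setupEndpointRef. The "Ignoring non-Spark config property: hive.spark.client.*" warnings are expected noise; those properties are consumed by Hive's RPC layer, not by spark-submit, so it reports and skips them.

The oversized "frame" is not actually a size problem. Spark's TransportFrameDecoder reads the first 8 bytes off the socket as a length field and reports it minus its own 8-byte header, so the logged value can be decoded back into the bytes the peer actually sent. A minimal sketch of that decode, runnable with any Python 3 (the value is copied from the trace; the +8 undoes the header subtraction):

    n = 5211883372140375593           # "Too large frame" value from the log above
    raw = n + 8                       # add back the decoder's 8-byte length header
    print(raw.to_bytes(8, "big"))     # prints b'HTTP/1.1'

In other words, whatever answered on the master URL spoke HTTP/1.1: the client reached a web server, not a Spark RPC endpoint. Two hypotheses worth verifying (suggested by the trace, not confirmed by this log alone): spark.master in hive-site.xml pointing at the standalone master's web UI port (8080 by default) rather than its RPC port (7077 by default), or a Spark/Hive version mismatch, since the org.sparkproject.* shading in the frames indicates a Spark 3.x installation while Hive 2.3.9's Hive-on-Spark client is only documented to work against Spark 2.x.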