hive error logs

2023-08-20T12:00:31,469 INFO [main] conf.HiveConf: Found configuration file file:/home/infernus/hive-2.3.9/conf/hive-site.xml
2023-08-20T12:00:32,339 WARN [main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
2023-08-20T12:00:32,342 WARN [main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
2023-08-20T12:00:32,559 WARN [main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
2023-08-20T12:00:32,560 WARN [main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
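
[editor's note] The two WARN lines that keep recurring throughout this log usually mean hive-site.xml carries property names that this Hive build (2.3.9) does not define; they are harmless by themselves. A minimal HiveQL sketch for confirming that from the same CLI session, on the assumption the warnings come from stale hive-site.xml entries:

-- A minimal sketch, assuming stale hive-site.xml entries are the cause.
-- In the Hive CLI, "set <name>;" echoes a property's current value; a name
-- that HiveConf does not define is reported back as undefined.
set hive.metastore.db.type;
set hive.hadoop.configured.resources;

If both come back undefined, removing those property entries from hive-site.xml should silence the warnings without changing Hive's behavior.
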
2023-08-20T12:00:32,675 INFO [main] SessionState:
Logging initialized using configuration in file:/home/infernus/hive-2.3.9/conf/hive-log4j2.properties Async: true
2023-08-20T12:00:32,836 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-08-20T12:00:34,855 INFO [main] session.SessionState: Created HDFS directory: /tmp/hive/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a
2023-08-20T12:00:34,920 INFO [main] session.SessionState: Created local directory: /tmp/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a
2023-08-20T12:00:34,936 INFO [main] session.SessionState: Created HDFS directory: /tmp/hive/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a/_tmp_space.db
2023-08-20T12:00:34,967 INFO [main] conf.HiveConf: Using the default value passed in for log id: f0fd81a8-0d73-43d0-814e-bda0253c132a
2023-08-20T12:00:34,970 INFO [main] session.SessionState: Updating thread name to f0fd81a8-0d73-43d0-814e-bda0253c132a main
2023-08-20T12:00:38,948 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: Using the default value passed in for log id: f0fd81a8-0d73-43d0-814e-bda0253c132a
2023-08-20T12:00:39,026 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Compiling command(queryId=infernus_20230820120038_4eccc5c8-33bd-496c-b0c2-562457c90a4c): use stock
2023-08-20T12:00:39,914 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
2023-08-20T12:00:39,915 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
2023-08-20T12:00:39,916 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2023-08-20T12:00:39,995 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.ObjectStore: ObjectStore, initialize called
2023-08-20T12:00:42,539 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
2023-08-20T12:00:42,542 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
2023-08-20T12:00:42,546 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2023-08-20T12:00:46,108 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is POSTGRES
2023-08-20T12:00:46,115 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.ObjectStore: Initialized ObjectStore
2023-08-20T12:00:46,348 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: Added admin role in metastore
2023-08-20T12:00:46,359 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: Added public role in metastore
2023-08-20T12:00:46,394 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: No user is added in admin role, since config is empty
2023-08-20T12:00:46,570 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_all_functions
2023-08-20T12:00:46,571 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_all_functions
2023-08-20T12:00:46,606 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_database: stock
2023-08-20T12:00:46,607 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_database: stock
2023-08-20T12:00:46,641 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Semantic Analysis Completed
2023-08-20T12:00:46,645 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
2023-08-20T12:00:46,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Completed compiling command(queryId=infernus_20230820120038_4eccc5c8-33bd-496c-b0c2-562457c90a4c); Time taken: 7.671 seconds
2023-08-20T12:00:46,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
2023-08-20T12:00:46,664 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Executing command(queryId=infernus_20230820120038_4eccc5c8-33bd-496c-b0c2-562457c90a4c): use stock
2023-08-20T12:00:46,732 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] sqlstd.SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=f0fd81a8-0d73-43d0-814e-bda0253c132a, clientType=HIVECLI]
2023-08-20T12:00:46,737 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] session.SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
2023-08-20T12:00:46,737 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] hive.metastore: Mestastore configuration hive.metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
2023-08-20T12:00:46,740 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
2023-08-20T12:00:46,740 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=Cleaning up thread local RawStore...
2023-08-20T12:00:46,741 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
2023-08-20T12:00:46,741 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=Done cleaning up thread local RawStore
2023-08-20T12:00:46,766 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Starting task [Stage-0:DDL] in serial mode
2023-08-20T12:00:46,768 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_database: stock
2023-08-20T12:00:46,768 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_database: stock
2023-08-20T12:00:46,806 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
2023-08-20T12:00:46,807 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
2023-08-20T12:00:46,808 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2023-08-20T12:00:46,810 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.ObjectStore: ObjectStore, initialize called
2023-08-20T12:00:46,831 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is POSTGRES
2023-08-20T12:00:46,831 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.ObjectStore: Initialized ObjectStore
2023-08-20T12:00:46,837 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_database: stock
2023-08-20T12:00:46,838 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_database: stock
2023-08-20T12:00:46,846 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Completed executing command(queryId=infernus_20230820120038_4eccc5c8-33bd-496c-b0c2-562457c90a4c); Time taken: 0.182 seconds
2023-08-20T12:00:46,848 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: OK
2023-08-20T12:00:46,852 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: Using the default value passed in for log id: f0fd81a8-0d73-43d0-814e-bda0253c132a
2023-08-20T12:00:46,852 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] CliDriver: Time taken: 7.87 seconds
2023-08-20T12:00:46,852 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] session.SessionState: Resetting thread name to main
2023-08-20T12:00:48,542 INFO [main] conf.HiveConf: Using the default value passed in for log id: f0fd81a8-0d73-43d0-814e-bda0253c132a
2023-08-20T12:00:48,542 INFO [main] session.SessionState: Updating thread name to f0fd81a8-0d73-43d0-814e-bda0253c132a main
2023-08-20T12:00:48,544 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Compiling command(queryId=infernus_20230820120048_1fb1cd87-c4ed-4156-948c-4a3e42db0c06): select count(*) from yahoo_table
2023-08-20T12:00:48,649 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Starting Semantic Analysis
2023-08-20T12:00:48,656 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Completed phase 1 of Semantic Analysis
2023-08-20T12:00:48,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for source tables
2023-08-20T12:00:48,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_table : db=stock tbl=yahoo_table
2023-08-20T12:00:48,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_table : db=stock tbl=yahoo_table
2023-08-20T12:00:48,871 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for subqueries
2023-08-20T12:00:48,885 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for destination tables
2023-08-20T12:00:48,985 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Context: New scratch dir is hdfs://localhost:9000/tmp/hive/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a/hive_2023-08-20_12-00-48_568_8196585506508880469-1
2023-08-20T12:00:48,994 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Completed getting MetaData in Semantic Analysis
2023-08-20T12:00:53,255 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for source tables
2023-08-20T12:00:53,256 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] metastore.HiveMetaStore: 0: get_table : db=stock tbl=yahoo_table
2023-08-20T12:00:53,256 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] HiveMetaStore.audit: ugi=infernus ip=unknown-ip-addr cmd=get_table : db=stock tbl=yahoo_table
2023-08-20T12:00:53,368 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for subqueries
2023-08-20T12:00:53,368 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Get metadata for destination tables
2023-08-20T12:00:53,385 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Context: New scratch dir is hdfs://localhost:9000/tmp/hive/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a/hive_2023-08-20_12-00-48_568_8196585506508880469-1
2023-08-20T12:00:53,555 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] common.FileUtils: Creating directory if it doesn't exist: hdfs://localhost:9000/tmp/hive/infernus/f0fd81a8-0d73-43d0-814e-bda0253c132a/hive_2023-08-20_12-00-48_568_8196585506508880469-1/-mr-10001/.hive-staging_hive_2023-08-20_12-00-48_568_8196585506508880469-1
2023-08-20T12:00:54,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: CBO Succeeded; optimized logical plan.
2023-08-20T12:00:54,429 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for FS(6)
2023-08-20T12:00:54,430 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for SEL(5)
2023-08-20T12:00:54,430 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for GBY(4)
2023-08-20T12:00:54,431 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for RS(3)
2023-08-20T12:00:54,431 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for GBY(2)
2023-08-20T12:00:54,431 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for SEL(1)
2023-08-20T12:00:54,431 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ppd.OpProcFactory: Processing for TS(0)
2023-08-20T12:00:54,477 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] optimizer.ColumnPrunerProcFactory: RS 3 oldColExprMap: {VALUE._col0=Column[_col0]}
2023-08-20T12:00:54,480 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] optimizer.ColumnPrunerProcFactory: RS 3 newColExprMap: {VALUE._col0=Column[_col0]}
2023-08-20T12:00:54,943 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SetSparkReducerParallelism: Number of reducers for sink RS[3] was already determined to be: 1
2023-08-20T12:00:55,057 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] parse.CalcitePlanner: Completed plan generation
2023-08-20T12:00:55,058 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Semantic Analysis Completed
2023-08-20T12:00:55,058 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:_c0, type:bigint, comment:null)], properties:null)
2023-08-20T12:00:55,082 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] exec.ListSinkOperator: Initializing operator LIST_SINK[7]
2023-08-20T12:00:55,096 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Completed compiling command(queryId=infernus_20230820120048_1fb1cd87-c4ed-4156-948c-4a3e42db0c06); Time taken: 6.551 seconds
2023-08-20T12:00:55,096 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
2023-08-20T12:00:55,096 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Executing command(queryId=infernus_20230820120048_1fb1cd87-c4ed-4156-948c-4a3e42db0c06): select count(*) from yahoo_table
2023-08-20T12:00:55,097 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Query ID = infernus_20230820120048_1fb1cd87-c4ed-4156-948c-4a3e42db0c06
2023-08-20T12:00:55,097 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Total jobs = 1
2023-08-20T12:00:55,114 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Launching Job 1 out of 1
2023-08-20T12:00:55,119 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Starting task [Stage-1:MAPRED] in serial mode
2023-08-20T12:00:55,119 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: In order to change the average load for a reducer (in bytes):
2023-08-20T12:00:55,120 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: set hive.exec.reducers.bytes.per.reducer=<number>
2023-08-20T12:00:55,120 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: In order to limit the maximum number of reducers:
2023-08-20T12:00:55,120 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: set hive.exec.reducers.max=<number>
2023-08-20T12:00:55,121 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: In order to set a constant number of reducers:
2023-08-20T12:00:55,121 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: set mapreduce.job.reduces=<number>
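
[editor's note] The SparkTask lines above are Hive's standard hints for steering reducer parallelism. A short sketch of applying them in this same CLI session; every number below is an illustrative assumption, not a tuned value:

-- Applying the hints printed above; the values are illustrative assumptions.
set hive.exec.reducers.bytes.per.reducer=268435456;  -- target roughly 256 MB of input per reducer
set hive.exec.reducers.max=64;                       -- cap how many reducers Hive may launch
-- set mapreduce.job.reduces=8;                      -- or pin an exact reducer count instead
select count(*) from yahoo_table;

The first two settings let Hive estimate the reducer count from input size; pinning mapreduce.job.reduces bypasses that estimate entirely, which is why it is left commented out here.
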
2023-08-20T12:00:55,154 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] session.SparkSessionManagerImpl: Setting up the session manager.
2023-08-20T12:00:55,266 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.hadoop.configured.resources does not exist
2023-08-20T12:00:55,266 WARN [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: HiveConf of name hive.metastore.db.type does not exist
2023-08-20T12:00:55,268 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.hfilecleaner.plugins -> org.apache.hadoop.hbase.master.cleaner.TimeToLiveHFileCleaner).
2023-08-20T12:00:55,269 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.regionSplitLimit -> 1000).
2023-08-20T12:00:55,269 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.timeout.period -> 60000).
2023-08-20T12:00:55,269 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rootdir -> /tmp/hbase-infernus/hbase).
2023-08-20T12:00:55,269 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.majorcompaction -> 604800000).
2023-08-20T12:00:55,269 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.logroll.period -> 3600000).
2023-08-20T12:00:55,270 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compactionThreshold -> 3).
2023-08-20T12:00:55,270 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.bytes.per.checksum -> 16384).
2023-08-20T12:00:55,270 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.client.fallback-to-simple-auth-allowed -> false).
2023-08-20T12:00:55,270 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.perregion.tasks -> 1).
2023-08-20T12:00:55,270 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.max.filesize -> 10737418240).
2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.region.split.policy -> org.apache.hadoop.hbase.regionserver.IncreasingToUpperBoundRegionSplitPolicy).
2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.thread.wakefrequency -> 10000).
2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.majorcompaction.jitter -> 0.50).
2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.abortonerror -> true).
2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.logcleaner.ttl -> 600000).
2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load spark property from hive configuration (spark.master -> spark://localhost:4040).
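
[editor's note] One detail worth flagging in the spark.master line just above: port 4040 is conventionally the Spark application web UI, while a standalone master's RPC endpoint normally listens on 7077. If the Spark session fails to launch later in this log, re-pointing the property is a reasonable first check; the address below is an assumption about this cluster, not a confirmed fix:

-- Hypothetical correction, assuming a standalone Spark master on its default port.
set spark.master=spark://localhost:7077;
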
2023-08-20T12:00:55,271 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.parent -> /hbase).
2023-08-20T12:00:55,272 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.scan.ratio -> 0).
2023-08-20T12:00:55,272 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.rpc.threads -> 8).
2023-08-20T12:00:55,272 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.framed.max_frame_size_in_mb -> 2).
2023-08-20T12:00:55,272 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.filter.classes -> org.apache.hadoop.hbase.rest.filter.GzipFilter).
2023-08-20T12:00:55,272 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.port -> 16000).
2023-08-20T12:00:55,273 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.connect.timeout -> 9000000).
2023-08-20T12:00:55,273 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.dynamic.jars.dir -> /tmp/hbase-infernus/hbase/lib).
2023-08-20T12:00:55,273 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.catalog.timeout -> 600000).
2023-08-20T12:00:55,273 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.blockingStoreFiles -> 10).
2023-08-20T12:00:55,274 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.checksum.algorithm -> CRC32).
2023-08-20T12:00:55,275 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.optionalcacheflushinterval -> 3600000).
2023-08-20T12:00:55,275 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rpc.timeout -> 60000).
2023-08-20T12:00:55,275 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.clientPort -> 2181).
2023-08-20T12:00:55,275 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.syncLimit -> 5).
2023-08-20T12:00:55,275 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.catalog.timeout -> 600000).
2023-08-20T12:00:55,276 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.write.buffer -> 2097152).
2023-08-20T12:00:55,276 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.restore.take.failsafe.snapshot -> true).
2023-08-20T12:00:55,276 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.bulkload.staging.dir -> /user/infernus/hbase-staging).
2023-08-20T12:00:55,276 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.rootserver -> root-region-server).
2023-08-20T12:00:55,276 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.port -> 8080).
2023-08-20T12:00:55,277 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rootdir.perms -> 700).
2023-08-20T12:00:55,277 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.localityCheck.threadPoolSize -> 2).
2023-08-20T12:00:55,277 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.auth.token.max.lifetime -> 604800000).
2023-08-20T12:00:55,277 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.dns.nameserver -> default).
2023-08-20T12:00:55,277 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.enabled -> true).
2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.cells.scanned.per.heartbeat.check -> 10000).
2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load spark property from hive configuration (spark.submit.deployMode -> cluster).
2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regions.slop -> 0.2).
2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.useMulti -> true).
2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.staticuser.user -> dr.stack).
2023-08-20T12:00:55,279 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.table.max.rowsize -> 1073741824).
2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.blockingWaitTime -> 90000).
2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.quorum -> localhost).
2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rpc.shortoperation.timeout -> 10000).
2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.framed -> false).
2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.tmp.dir -> /tmp/hbase-infernus).
2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.initLimit -> 10).
2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.compact -> false).
2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.info.bindAddress -> 0.0.0.0).
2023-08-20T12:00:55,280 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.lease.recovery.timeout -> 900000).
2023-08-20T12:00:55,281 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.maxClientCnxns -> 300).
2023-08-20T12:00:55,281 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.online.schema.update.enable -> true).
2023-08-20T12:00:55,281 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.exec.permission.checks -> false).
2023-08-20T12:00:55,281 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.peerport -> 2888).
2023-08-20T12:00:55,282 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.dfs.client.read.shortcircuit.buffer.size -> 131072).
2023-08-20T12:00:55,282 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.region.replica.replication.enabled -> false).
2023-08-20T12:00:55,298 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.readonly -> false).
2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.config.read.zookeeper.config -> false).
2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.restore.failsafe.name -> hbase-failsafe-{snapshot.name}-{restore.timestamp}).
2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.max.result.size -> 2097152).
2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.metrics.showTableName -> true).
2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.filter.initializers -> org.apache.hadoop.hbase.http.lib.StaticUserWebFilter).
2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.perserver.tasks -> 5).
2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.threads.min -> 2).
2023-08-20T12:00:55,299 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.keyvalue.maxsize -> 10485760).
2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.storescanner.parallel.seek.enable -> false).
2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.secret.bits -> 256).
2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.column.max.version -> 1).
2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.loadbalancer.class -> org.apache.hadoop.hbase.master.balancer.StochasticLoadBalancer).
2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.dns.interface -> default).
2023-08-20T12:00:55,300 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.handler.count -> 30).
2023-08-20T12:00:55,301 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.defaults.for.version.skip -> false).
2023-08-20T12:00:55,301 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.compactchecker.interval.multiplier -> 1000).
2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.distributed.log.replay -> false).
2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.port -> 16030).
2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.storefile.refresh.period -> 0).
2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.threads.max -> 100).
2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.balancer.period -> 300000).
2023-08-20T12:00:55,303 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.bindAddress -> 0.0.0.0).
2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.logcleaner.plugins -> org.apache.hadoop.hbase.master.cleaner.TimeToLiveLogCleaner).
2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.authentication -> simple).
2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.auth.key.update.interval -> 86400000).
2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.support.proxyuser -> false).
2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.cluster.distributed -> false).
2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.fs.tmp.dir -> /user/infernus/hbase-staging).
2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.scanner.max.result.size -> 104857600).
2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.maxQueuedRequests -> 1000).
2023-08-20T12:00:55,304 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.total.tasks -> 100).
2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.port.auto -> false).
2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.server.connect.timeout -> 90000).
2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.mslab.enabled -> true).
2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.port -> 16020).
2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.dns.interface -> default).
2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.lease.recovery.dfs.timeout -> 64000).
2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.table.lock.enable -> true).
2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.percolumnfamilyflush.size.lower.bound -> 16777216).
2023-08-20T12:00:55,305 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.read.ratio -> 0).
2023-08-20T12:00:55,307 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.logroll.errors.tolerated -> 2).
2023-08-20T12:00:55,308 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.max.threads -> 10).
2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.multicast.address.ip -> 226.1.1.3).
2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.dns.nameserver -> default).
2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.multicast.address.port -> 16100).
2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.publisher.class -> org.apache.hadoop.hbase.master.ClusterStatusPublisher$MulticastPublisher).
2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.published -> false).
2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.retries.number -> 35).
2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.listener.class -> org.apache.hadoop.hbase.client.ClusterStatusListener$MulticastListener).
2023-08-20T12:00:55,309 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.caching -> 2147483647).
2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.dataDir -> /tmp/hbase-infernus/zookeeper).
2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.local.dir -> /tmp/hbase-infernus/local/).
2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.visibility.mutations.checkauths -> false).
2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.msginterval -> 3000).
2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.acl.parent -> acl).
2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.pause -> 100).
2023-08-20T12:00:55,310 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rs.cacheblocksonwrite -> false).
2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.enabled -> true).
2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.data.umask.enable -> false).
2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.checksum.verify -> true).
2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.handler.factor -> 0.1).
2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.block.multiplier -> 4).
2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.time.to.purge.deletes -> 0).
2023-08-20T12:00:55,311 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.htablepool.size.max -> 1000).
2023-08-20T12:00:55,312 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.replication.rpc.codec -> org.apache.hadoop.hbase.codec.KeyValueCodecWithTags).
2023-08-20T12:00:55,315 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compaction.kv.max -> 10).
2023-08-20T12:00:55,316 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compaction.max -> 10).
2023-08-20T12:00:55,316 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.maxWorkerThreads -> 1000).
2023-08-20T12:00:55,316 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.data.umask -> 000).
2023-08-20T12:00:55,316 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.user.enabled -> true).
2023-08-20T12:00:55,316 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.flusher.count -> 2).
2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.metrics.exposeOperationTimes -> true).
2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.client.tcpnodelay -> true).
2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.rpc.max.size -> 52428800).
2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.flush.size -> 134217728).
2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.bulkload.retries.number -> 10).
2023-08-20T12:00:55,317 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.hlog.reader.impl -> org.apache.hadoop.hbase.regionserver.wal.ProtobufLogReader).
2023-08-20T12:00:55,318 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.handler.abort.on.error.percent -> 0.5).
2023-08-20T12:00:55,318 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coordinated.state.manager.class -> org.apache.hadoop.hbase.coordination.ZkCoordinatedStateManager).
2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.versionfile.writeattempts -> 3).
2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.defaults.for.version -> 1.1.1).
2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.hlog.writer.impl -> org.apache.hadoop.hbase.regionserver.wal.ProtobufLogWriter).
2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.leaderport -> 3888).
2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.minWorkerThreads -> 16).
2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.storescanner.parallel.seek.threads -> 10).
2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.preclose.flush.size -> 5242880).
2023-08-20T12:00:55,319 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.info.port -> 16010).
2023-08-20T12:00:55,320 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.infoserver.redirect -> true).
2023-08-20T12:00:55,649 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.hfilecleaner.plugins -> org.apache.hadoop.hbase.master.cleaner.TimeToLiveHFileCleaner).
2023-08-20T12:00:55,650 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.regionSplitLimit -> 1000).
2023-08-20T12:00:55,650 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.timeout.period -> 60000).
2023-08-20T12:00:55,650 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rootdir -> /tmp/hbase-infernus/hbase).
2023-08-20T12:00:55,650 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.majorcompaction -> 604800000).
2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.logroll.period -> 3600000).
2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compactionThreshold -> 3).
2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.bytes.per.checksum -> 16384).
2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.client.fallback-to-simple-auth-allowed -> false).
2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.perregion.tasks -> 1).
2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.max.filesize -> 10737418240).
2023-08-20T12:00:55,654 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.region.split.policy -> org.apache.hadoop.hbase.regionserver.IncreasingToUpperBoundRegionSplitPolicy).
2023-08-20T12:00:55,655 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.thread.wakefrequency -> 10000).
2023-08-20T12:00:55,655 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.majorcompaction.jitter -> 0.50).
2023-08-20T12:00:55,655 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.abortonerror -> true).
2023-08-20T12:00:55,655 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.logcleaner.ttl -> 600000).
2023-08-20T12:00:55,655 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load spark property from hive configuration (spark.master -> spark://localhost:4040).
2023-08-20T12:00:55,656 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.parent -> /hbase).
2023-08-20T12:00:55,656 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.scan.ratio -> 0).
2023-08-20T12:00:55,656 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.rpc.threads -> 8).
2023-08-20T12:00:55,656 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.framed.max_frame_size_in_mb -> 2).
2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.filter.classes -> org.apache.hadoop.hbase.rest.filter.GzipFilter).
2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.port -> 16000).
2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.connect.timeout -> 9000000).
2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.dynamic.jars.dir -> /tmp/hbase-infernus/hbase/lib).
2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.catalog.timeout -> 600000).
2023-08-20T12:00:55,657 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.blockingStoreFiles -> 10).
2023-08-20T12:00:55,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.checksum.algorithm -> CRC32).
2023-08-20T12:00:55,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.optionalcacheflushinterval -> 3600000).
2023-08-20T12:00:55,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rpc.timeout -> 60000).
2023-08-20T12:00:55,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.clientPort -> 2181).
2023-08-20T12:00:55,658 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.syncLimit -> 5).
2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.catalog.timeout -> 600000).
2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.write.buffer -> 2097152).
2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.restore.take.failsafe.snapshot -> true).
2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.bulkload.staging.dir -> /user/infernus/hbase-staging).
2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.rootserver -> root-region-server).
2023-08-20T12:00:55,659 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.port -> 8080).
2023-08-20T12:00:55,660 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rootdir.perms -> 700).
2023-08-20T12:00:55,660 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.localityCheck.threadPoolSize -> 2).
2023-08-20T12:00:55,660 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.auth.token.max.lifetime -> 604800000).
2023-08-20T12:00:55,660 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.dns.nameserver -> default).
2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.enabled -> true).
2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.cells.scanned.per.heartbeat.check -> 10000).
2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load spark property from hive configuration (spark.submit.deployMode -> cluster).
2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regions.slop -> 0.2).
2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.useMulti -> true).
2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.staticuser.user -> dr.stack).
2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.table.max.rowsize -> 1073741824).
2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.blockingWaitTime -> 90000).
2023-08-20T12:00:55,661 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.quorum -> localhost).
2023-08-20T12:00:55,662 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rpc.shortoperation.timeout -> 10000).
2023-08-20T12:00:55,662 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.framed -> false).
2023-08-20T12:00:55,662 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.tmp.dir -> /tmp/hbase-infernus).
2023-08-20T12:00:55,662 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.initLimit -> 10).
2023-08-20T12:00:55,662 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.thrift.compact -> false).
2023-08-20T12:00:55,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.info.bindAddress -> 0.0.0.0).
2023-08-20T12:00:55,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.lease.recovery.timeout -> 900000).
2023-08-20T12:00:55,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.maxClientCnxns -> 300).
2023-08-20T12:00:55,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.online.schema.update.enable -> true).
2023-08-20T12:00:55,663 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.exec.permission.checks -> false).
2023-08-20T12:00:55,664 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.peerport -> 2888).
2023-08-20T12:00:55,664 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.dfs.client.read.shortcircuit.buffer.size -> 131072).
2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.region.replica.replication.enabled -> false).
2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.readonly -> false).
2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.config.read.zookeeper.config -> false).
2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.snapshot.restore.failsafe.name -> hbase-failsafe-{snapshot.name}-{restore.timestamp}).
2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.max.result.size -> 2097152).
2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.metrics.showTableName -> true).
2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.filter.initializers -> org.apache.hadoop.hbase.http.lib.StaticUserWebFilter).
2023-08-20T12:00:55,665 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.perserver.tasks -> 5).
2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.threads.min -> 2).
2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.keyvalue.maxsize -> 10485760).
2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.storescanner.parallel.seek.enable -> false).
2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.secret.bits -> 256).
2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.column.max.version -> 1).
2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.loadbalancer.class -> org.apache.hadoop.hbase.master.balancer.StochasticLoadBalancer).
2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.dns.interface -> default).
2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.handler.count -> 30).
2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.defaults.for.version.skip -> false).
2023-08-20T12:00:55,666 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.compactchecker.interval.multiplier -> 1000).
2023-08-20T12:00:55,667 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.distributed.log.replay -> false).
2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.port -> 16030).
  346. 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.storefile.refresh.period -> 0).
  347. 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.threads.max -> 100).
  348. 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.balancer.period -> 300000).
  349. 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.bindAddress -> 0.0.0.0).
  350. 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.logcleaner.plugins -> org.apache.hadoop.hbase.master.cleaner.TimeToLiveLogCleaner).
  351. 2023-08-20T12:00:55,670 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.authentication -> simple).
  352. 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.auth.key.update.interval -> 86400000).
  353. 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rest.support.proxyuser -> false).
  354. 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.cluster.distributed -> false).
  355. 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.fs.tmp.dir -> /user/infernus/hbase-staging).
  356. 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.scanner.max.result.size -> 104857600).
  357. 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.maxQueuedRequests -> 1000).
  358. 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.max.total.tasks -> 100).
  359. 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.info.port.auto -> false).
  360. 2023-08-20T12:00:55,671 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.server.connect.timeout -> 90000).
  361. 2023-08-20T12:00:55,672 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.mslab.enabled -> true).
  362. 2023-08-20T12:00:55,672 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.port -> 16020).
  363. 2023-08-20T12:00:55,672 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.dns.interface -> default).
  364. 2023-08-20T12:00:55,673 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.lease.recovery.dfs.timeout -> 64000).
  365. 2023-08-20T12:00:55,673 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.table.lock.enable -> true).
  366. 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.percolumnfamilyflush.size.lower.bound -> 16777216).
  367. 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.read.ratio -> 0).
  368. 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.logroll.errors.tolerated -> 2).
  369. 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.http.max.threads -> 10).
  370. 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.multicast.address.ip -> 226.1.1.3).
  371. 2023-08-20T12:00:55,675 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.dns.nameserver -> default).
  372. 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.multicast.address.port -> 16100).
  373. 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.publisher.class -> org.apache.hadoop.hbase.master.ClusterStatusPublisher$MulticastPublisher).
  374. 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.published -> false).
  375. 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.retries.number -> 35).
  376. 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.status.listener.class -> org.apache.hadoop.hbase.client.ClusterStatusListener$MulticastListener).
  377. 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.scanner.caching -> 2147483647).
  378. 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.property.dataDir -> /tmp/hbase-infernus/zookeeper).
  379. 2023-08-20T12:00:55,676 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.local.dir -> /tmp/hbase-infernus/local/).
  380. 2023-08-20T12:00:55,677 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.security.visibility.mutations.checkauths -> false).
  381. 2023-08-20T12:00:55,677 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.msginterval -> 3000).
  382. 2023-08-20T12:00:55,677 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (zookeeper.znode.acl.parent -> acl).
  383. 2023-08-20T12:00:55,677 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.client.pause -> 100).
  384. 2023-08-20T12:00:55,678 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.rs.cacheblocksonwrite -> false).
  385. 2023-08-20T12:00:55,678 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.enabled -> true).
  386. 2023-08-20T12:00:55,678 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.data.umask.enable -> false).
  387. 2023-08-20T12:00:55,678 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.checksum.verify -> true).
  388. 2023-08-20T12:00:55,679 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.server.callqueue.handler.factor -> 0.1).
  389. 2023-08-20T12:00:55,679 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.block.multiplier -> 4).
  390. 2023-08-20T12:00:55,679 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.time.to.purge.deletes -> 0).
  391. 2023-08-20T12:00:55,679 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.htablepool.size.max -> 1000).
  392. 2023-08-20T12:00:55,679 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.replication.rpc.codec -> org.apache.hadoop.hbase.codec.KeyValueCodecWithTags).
  393. 2023-08-20T12:00:55,680 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compaction.kv.max -> 10).
  394. 2023-08-20T12:00:55,680 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.compaction.max -> 10).
  395. 2023-08-20T12:00:55,680 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.maxWorkerThreads -> 1000).
  396. 2023-08-20T12:00:55,680 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.data.umask -> 000).
  397. 2023-08-20T12:00:55,680 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coprocessor.user.enabled -> true).
  398. 2023-08-20T12:00:55,682 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hstore.flusher.count -> 2).
  399. 2023-08-20T12:00:55,683 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.metrics.exposeOperationTimes -> true).
  400. 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.ipc.client.tcpnodelay -> true).
  401. 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load RPC property from hive configuration (hive.spark.client.rpc.max.size -> 52428800).
  402. 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.memstore.flush.size -> 134217728).
  403. 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.bulkload.retries.number -> 10).
  404. 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.hlog.reader.impl -> org.apache.hadoop.hbase.regionserver.wal.ProtobufLogReader).
  405. 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.handler.abort.on.error.percent -> 0.5).
  406. 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.coordinated.state.manager.class -> org.apache.hadoop.hbase.coordination.ZkCoordinatedStateManager).
  407. 2023-08-20T12:00:55,684 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.server.versionfile.writeattempts -> 3).
  408. 2023-08-20T12:00:55,685 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.defaults.for.version -> 1.1.1).
  409. 2023-08-20T12:00:55,685 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.regionserver.hlog.writer.impl -> org.apache.hadoop.hbase.regionserver.wal.ProtobufLogWriter).
  410. 2023-08-20T12:00:55,685 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.zookeeper.leaderport -> 3888).
  411. 2023-08-20T12:00:55,687 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.thrift.minWorkerThreads -> 16).
  412. 2023-08-20T12:00:55,687 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.storescanner.parallel.seek.threads -> 10).
  413. 2023-08-20T12:00:55,687 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.hregion.preclose.flush.size -> 5242880).
  414. 2023-08-20T12:00:55,687 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.info.port -> 16010).
  415. 2023-08-20T12:00:55,687 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.HiveSparkClientFactory: load HBase configuration (hbase.master.infoserver.redirect -> true).
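
Everything up to this point is HiveSparkClientFactory assembling the remote Spark client's configuration: it copies every spark.* property it finds in the Hive configuration (here only spark.submit.deployMode -> cluster), the hive.spark.client.* RPC settings, and the full hbase.* configuration into what the child spark-submit process will receive. As a rough sketch, Hive-side settings of the following shape would produce the values logged above (the keys and values are taken from the log lines; hive.execution.engine is implied by the SparkTask later in the log rather than printed here, and the exact placement of each key is an assumption):

    hive.execution.engine=spark
    spark.submit.deployMode=cluster
    hive.spark.client.secret.bits=256
    hive.spark.client.server.connect.timeout=90000
    hive.spark.client.rpc.max.size=52428800
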
2023-08-20T12:00:56,164 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] client.SparkClientImpl: Running client driver with argv: /home/infernus/spark-3.3.2-bin-hadoop3/bin/spark-submit --properties-file /tmp/spark-submit.7742744567511501757.properties --class org.apache.hive.spark.client.RemoteDriver /home/infernus/hive-2.3.9/lib/hive-exec-2.3.9.jar --remote-host infernuspc --remote-port 39847 --conf hive.spark.client.connect.timeout=9000000 --conf hive.spark.client.server.connect.timeout=90000 --conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 --conf hive.spark.client.rpc.server.address=null
2023-08-20T12:01:00,981 INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
2023-08-20T12:01:00,984 INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
2023-08-20T12:01:00,984 INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
2023-08-20T12:01:00,984 INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
2023-08-20T12:01:00,984 INFO [stderr-redir-1] client.SparkClientImpl: Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
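
The five "Ignoring non-Spark config property" warnings are expected noise rather than the failure itself: spark-submit drops any key in its properties file that does not start with "spark." and prints exactly this warning, while the same hive.spark.client.* settings also appear after the application jar in the argv line above, where Hive's RemoteDriver is meant to parse them on its own. The fatal part starts with the connection attempt below.
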
2023-08-20T12:01:01,316 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:01 WARN Utils: Your hostname, infernuspc resolves to a loopback address: 127.0.1.1; using 10.0.3.15 instead (on interface enp0s8)
2023-08-20T12:01:01,323 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:01 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
2023-08-20T12:01:02,982 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2023-08-20T12:01:03,132 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:03 INFO SecurityManager: Changing view acls to: infernus
2023-08-20T12:01:03,134 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:03 INFO SecurityManager: Changing modify acls to: infernus
2023-08-20T12:01:03,135 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:03 INFO SecurityManager: Changing view acls groups to:
2023-08-20T12:01:03,139 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:03 INFO SecurityManager: Changing modify acls groups to:
2023-08-20T12:01:03,141 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(infernus); groups with view permissions: Set(); users with modify permissions: Set(infernus); groups with modify permissions: Set()
2023-08-20T12:01:04,135 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 INFO Utils: Successfully started service 'driverClient' on port 42723.
2023-08-20T12:01:04,312 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 INFO TransportClientFactory: Successfully created connection to localhost/127.0.0.1:4040 after 91 ms (0 ms spent in bootstraps)
2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 WARN TransportChannelHandler: Exception in connection from localhost/127.0.0.1:4040
2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
2023-08-20T12:01:04,378 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
2023-08-20T12:01:04,379 INFO [stdout-redir-1] client.SparkClientImpl: at java.lang.Thread.run(Thread.java:750)
2023-08-20T12:01:04,381 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from localhost/127.0.0.1:4040 is closed
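
The "Too large frame: 5211883372140375593" is the diagnostic clue. The client connected to localhost/127.0.0.1:4040 (line above), and 4040 is the port the Spark web UI, an HTTP server, listens on, not an RPC endpoint; TransportFrameDecoder therefore read the first eight bytes of an HTTP response as a big-endian frame length. A minimal sketch that checks this, assuming (as the exact off-by-8 match suggests) that the reported size is the on-wire long minus the decoder's own 8-byte length field:

    // Hypothetical checker, not part of Hive or Spark: turns the reported
    // frame size back into the bytes the peer actually sent.
    import java.nio.ByteBuffer;
    import java.nio.charset.StandardCharsets;

    public class FrameSizeDecode {
        public static void main(String[] args) {
            long reported = 5211883372140375593L;   // value from the log above
            long onWire = reported + 8;             // add back the 8-byte length header
            byte[] first8 = ByteBuffer.allocate(8).putLong(onWire).array();
            // Prints "HTTP/1.1": the peer on port 4040 is speaking HTTP.
            System.out.println(new String(first8, StandardCharsets.US_ASCII));
        }
    }

Since the recovered bytes read "HTTP/1.1", the master URL handed to the cluster-mode ClientApp almost certainly points at the UI port rather than at the standalone master's RPC port.
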
2023-08-20T12:01:04,388 INFO [stderr-redir-1] client.SparkClientImpl: Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
2023-08-20T12:01:04,388 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
2023-08-20T12:01:04,388 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
2023-08-20T12:01:04,388 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
2023-08-20T12:01:04,388 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.TraversableLike.map(TraversableLike.scala:286)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
2023-08-20T12:01:04,390 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
2023-08-20T12:01:04,391 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
2023-08-20T12:01:04,392 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
2023-08-20T12:01:04,393 INFO [stderr-redir-1] client.SparkClientImpl: at java.lang.Thread.run(Thread.java:750)
2023-08-20T12:01:04,418 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 INFO ShutdownHookManager: Shutdown hook called
2023-08-20T12:01:04,420 INFO [stdout-redir-1] client.SparkClientImpl: 23/08/20 12:01:04 INFO ShutdownHookManager: Deleting directory /tmp/spark-15d0e454-792d-44a3-879c-7189721907ad
2023-08-20T12:01:04,861 ERROR [f0fd81a8-0d73-43d0-814e-bda0253c132a main] client.SparkClientImpl: Error while waiting for client to connect.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:750)

at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41) ~[netty-all-4.0.52.Final.jar:4.0.52.Final]
at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:109) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:97) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:126) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:103) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) ~[hive-cli-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) ~[hive-cli-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) ~[hive-cli-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) ~[hive-cli-2.3.9.jar:2.3.9]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_382]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_382]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_382]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_382]
at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.3.1.jar:?]
at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.3.1.jar:?]
Caused by: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:750)

at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:212) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:503) ~[hive-exec-2.3.9.jar:2.3.9]
at java.lang.Thread.run(Thread.java:750) [?:1.8.0_382]
2023-08-20T12:01:04,861 WARN [Driver] client.SparkClientImpl: Child process exited with code 1
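
Given the HTTP bytes recovered above, the most likely root cause is a master URL of the form spark://localhost:4040 (or equivalent) in the Spark/Hive configuration. A sketch of the fix, with the caveat that the host and port below are assumptions: 7077 is only the standalone master's default RPC port, and the authoritative value is the spark://... URL shown in the banner of the master's web UI:

    # hypothetical corrected value, spark-defaults.conf style
    spark.master    spark://infernuspc:7077

Separately, the spark-submit path above is spark-3.3.2-bin-hadoop3, while Hive 2.3.9's Hive-on-Spark client was built against Spark 2.x; even with the port fixed, this version pairing can fail in other ways, so matching Hive with a supported Spark version is worth checking too.
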
  625. 2023-08-20T12:01:04,896 ERROR [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
  626. org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
  627. at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:64)
  628. at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
  629. at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:126)
  630. at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:103)
  631. at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
  632. at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
  633. at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
  634. at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
  635. at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
  636. at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
  637. at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
  638. at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233)
  639. at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
  640. at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
  641. at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
  642. at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
  643. at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
  644. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  645. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  646. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  647. at java.lang.reflect.Method.invoke(Method.java:498)
  648. at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
  649. at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
  650. Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
  651. Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
  652. Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
  653. Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
  654. Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
  655. Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
  656. at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
  657. at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
  658. at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
  659. at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
  660. at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
  661. at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
  662. at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
  663. at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
  664. at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
  665. at scala.collection.TraversableLike.map(TraversableLike.scala:286)
  666. at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
  667. at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
  668. at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
  669. at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
  670. at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
  671. at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
  672. at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
  673. at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
  674. at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
  675. at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  676. Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
  677. at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
  678. at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
  679. at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
  680. at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
  681. at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
  682. at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
  683. at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
  684. at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
  685. at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
  686. at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
  687. at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
  688. at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
  689. at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
  690. at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
  691. at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
  692. at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
  693. at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
  694. at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
  695. at java.lang.Thread.run(Thread.java:750)
  696.  
  697. at org.apache.hive.com.google.common.base.Throwables.propagate(Throwables.java:160)
  698. at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:125)
  699. at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
  700. at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101)
  701. at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:97)
  702. at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73)
  703. at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62)
  704. ... 22 more
  705. Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
  706. Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
  707. Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
  708. Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
  709. Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
  710. Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
  711. at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
  712. at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
  713. at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
  714. at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
  715. at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
  716. at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
  717. at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
  718. at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
  719. at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
  720. at scala.collection.TraversableLike.map(TraversableLike.scala:286)
  721. at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
  722. at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
  723. at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
  724. at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
  725. at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
  726. at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
  727. at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
  728. at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
  729. at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
  730. at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
  731. Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
  732. at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
  733. at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
  734. at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
  735. at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
  736. at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
  737. at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
  738. at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
  739. at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:750)

at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41)
at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:109)
... 27 more
Caused by: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:750)

at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:212)
at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:503)
at java.lang.Thread.run(Thread.java:750)

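The block above is the spark-submit child process's stderr, which Hive embeds verbatim in the exception message (and repeats once per nested cause below, which is why the same trace appears several times). The "Ignoring non-Spark config property" warnings are expected noise rather than the failure: hive.spark.client.* are Hive-side RPC settings that Hive-on-Spark forwards to spark-submit alongside the real spark.* properties, and Spark simply skips anything outside its own namespace. If these need tuning at all, that happens on the Hive side; a minimal sketch as Hive session statements, with illustrative values that are assumptions rather than values taken from this log:

  -- Hive-on-Spark client RPC settings (example values, assumed; defaults shown)
  set hive.spark.client.server.connect.timeout=90000ms;
  -- socket connect timeout on the client side (example value, assumed)
  set hive.spark.client.connect.timeout=1000ms;

The actual failure is the "Too large frame" exception underneath the warnings; see the note at the end of the log.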
2023-08-20T12:01:04,896 ERROR [f0fd81a8-0d73-43d0-814e-bda0253c132a main] spark.SparkTask: Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:64) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:126) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:103) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:233) ~[hive-cli-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184) ~[hive-cli-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) ~[hive-cli-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821) ~[hive-cli-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759) ~[hive-cli-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686) ~[hive-cli-2.3.9.jar:2.3.9]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_382]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_382]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_382]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_382]
at org.apache.hadoop.util.RunJar.run(RunJar.java:323) ~[hadoop-common-3.3.1.jar:?]
at org.apache.hadoop.util.RunJar.main(RunJar.java:236) ~[hadoop-common-3.3.1.jar:?]
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:750)

at org.apache.hive.com.google.common.base.Throwables.propagate(Throwables.java:160) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:125) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:97) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.3.9.jar:2.3.9]
... 22 more
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:750)

at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41) ~[netty-all-4.0.52.Final.jar:4.0.52.Final]
at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:109) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:97) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.3.9.jar:2.3.9]
... 22 more
Caused by: java.lang.RuntimeException: Cancel client 'cc1f8b8a-0bfd-4574-9b6d-90bb0238a71e'. Error: Child process exited before connecting back with error log Warning: Ignoring non-Spark config property: hive.spark.client.server.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.threads
Warning: Ignoring non-Spark config property: hive.spark.client.connect.timeout
Warning: Ignoring non-Spark config property: hive.spark.client.secret.bits
Warning: Ignoring non-Spark config property: hive.spark.client.rpc.max.size
Exception in thread "main" org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110)
at org.apache.spark.deploy.ClientApp.$anonfun$start$2(Client.scala:292)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:198)
at org.apache.spark.deploy.ClientApp.start(Client.scala:292)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Too large frame: 5211883372140375593
at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:750)

at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:212) ~[hive-exec-2.3.9.jar:2.3.9]
at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:503) ~[hive-exec-2.3.9.jar:2.3.9]
at java.lang.Thread.run(Thread.java:750) [?:1.8.0_382]
2023-08-20T12:01:04,897 ERROR [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create spark client.
2023-08-20T12:01:04,897 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] ql.Driver: Completed executing command(queryId=infernus_20230820120048_1fb1cd87-c4ed-4156-948c-4a3e42db0c06); Time taken: 9.801 seconds
2023-08-20T12:01:04,899 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] exec.ListSinkOperator: Closing operator LIST_SINK[7]
2023-08-20T12:01:04,984 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] conf.HiveConf: Using the default value passed in for log id: f0fd81a8-0d73-43d0-814e-bda0253c132a
2023-08-20T12:01:04,984 INFO [f0fd81a8-0d73-43d0-814e-bda0253c132a main] session.SessionState: Resetting thread name to main

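Every attempt in this log fails the same way: the spark-submit ClientApp calls RpcEnv.setupEndpointRef toward the standalone master and the reply is rejected with java.lang.IllegalArgumentException: Too large frame: 5211883372140375593. Spark's TransportFrameDecoder reads the first eight bytes of the reply as a frame length, and 5211883372140375593 is 0x485454502F312E29, bytes that begin with the ASCII text "HTTP/1." — the endpoint answered with an HTTP response, not Spark's RPC wire protocol. That usually means spark.master points at an HTTP port of the master (commonly the web UI on 8080 or the REST port 6066) instead of the RPC port, 7077 by default. Separately, the org.sparkproject.guava frames indicate a Spark 3.x client, while Hive 2.3.9's Hive-on-Spark targets Spark 2.x, so a version mismatch can produce the same symptom even with the right port. A minimal sketch of the usual first checks, as Hive session statements (the host and port are placeholders, not values recoverable from this log):

  -- point Hive at the standalone master's RPC endpoint, not its web UI or REST port
  set spark.master=spark://master-host:7077;
  -- or temporarily fall back to MapReduce to confirm the query itself is sound
  set hive.execution.engine=mr;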