vikram512700

Untitled

May 8th, 2019
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2399: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: invalid variable name
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2364: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: invalid variable name
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2459: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: invalid variable name
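The three "invalid variable name" lines above are a known cosmetic side effect of Hadoop 3's shell scripts: hadoop-functions.sh derives per-command environment variables (HADOOP_<command>_USER, HADOOP_<command>_OPTS) from the command name, and when Sqoop passes the fully qualified class org.apache.sqoop.Sqoop, the dots make the resulting name illegal in bash. The import proceeds anyway. A minimal reproduction of the bash behavior (the variable name is taken from the log; the surrounding script is illustrative):

```shell
#!/usr/bin/env bash
# bash identifiers may contain only letters, digits, and underscores,
# so an export with dots in the name is rejected -- the same failure
# hadoop-functions.sh hits with HADOOP_ORG.APACHE.SQOOP.SQOOP_USER.
if bash -c 'export HADOOP_ORG.APACHE.SQOOP.SQOOP_USER=root' 2>/dev/null; then
  echo "export accepted"
else
  echo "export rejected: invalid variable name"
fi
```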
2019-05-07 05:21:13,209 INFO [main] sqoop.Sqoop: Running Sqoop version: 1.4.7
Enter password:
2019-05-07 05:21:18,817 INFO [main] manager.MySQLManager: Preparing to use a MySQL streaming resultset.
2019-05-07 05:21:18,860 INFO [main] tool.CodeGenTool: Beginning code generation
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hbase/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Tue May 07 05:21:19 PDT 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
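The Connector/J warning above (repeated at every new connection throughout this log) is silenced by making the SSL choice explicit on the JDBC URL. A hedged sketch of the fix; the host, port, and database names are placeholders, not taken from the log:

```shell
# Append useSSL=false (or useSSL=true plus a truststore) to the JDBC URL
# so MySQL Connector/J stops warning on every connection.
# localhost/userdb are assumptions; substitute your own connect string.
JDBC_URL='jdbc:mysql://localhost:3306/userdb?useSSL=false'
echo "$JDBC_URL"
# The import command would then look roughly like:
# sqoop import --connect "$JDBC_URL" --username root -P \
#   --table employee --hive-import
```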
2019-05-07 05:21:20,539 INFO [main] manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
2019-05-07 05:21:20,666 INFO [main] manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
2019-05-07 05:21:20,724 INFO [main] orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-root/compile/152d233f78e17bb1d8e0f57be9cc041f/employee.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
2019-05-07 05:21:28,574 INFO [main] orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/152d233f78e17bb1d8e0f57be9cc041f/employee.jar
2019-05-07 05:21:28,612 WARN [main] manager.MySQLManager: It looks like you are importing from mysql.
2019-05-07 05:21:28,612 WARN [main] manager.MySQLManager: This transfer can be faster! Use the --direct
2019-05-07 05:21:28,613 WARN [main] manager.MySQLManager: option to exercise a MySQL-specific fast path.
2019-05-07 05:21:28,617 INFO [main] manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
2019-05-07 05:21:28,631 INFO [main] mapreduce.ImportJobBase: Beginning import of employee
2019-05-07 05:21:29,337 INFO [main] Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
2019-05-07 05:21:31,595 INFO [main] Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
2019-05-07 05:21:32,050 INFO [main] client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2019-05-07 05:21:34,542 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/parquet-avro-1.6.0.jar] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:476)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:652)
2019-05-07 05:21:34,608 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/libthrift-0.9.3.jar block BP-2014112863-127.0.1.1-1556278796917:blk_1073742257_1433] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:684)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:680)
2019-05-07 05:21:34,680 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/commons-logging-1.1.1.jar block BP-2014112863-127.0.1.1-1556278796917:blk_1073742258_1434] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:684)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:680)
2019-05-07 05:21:34,928 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/mysql-connector-java-5.1.47.jar] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:476)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:652)
2019-05-07 05:21:35,005 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/jackson-databind-2.3.1.jar block BP-2014112863-127.0.1.1-1556278796917:blk_1073742263_1439] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:684)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:680)
2019-05-07 05:21:35,074 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/parquet-column-1.6.0.jar block BP-2014112863-127.0.1.1-1556278796917:blk_1073742264_1440] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:684)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:680)
2019-05-07 05:21:35,413 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/kite-data-hive-1.1.0.jar] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:476)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:652)
2019-05-07 05:21:35,567 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/opencsv-2.3.jar] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:476)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:652)
2019-05-07 05:21:36,027 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/parquet-encoding-1.6.0.jar] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:476)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:652)
2019-05-07 05:21:36,267 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/slf4j-api-1.6.1.jar] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:476)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:652)
2019-05-07 05:21:36,350 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/ant-contrib-1.0b3.jar] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:476)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:652)
2019-05-07 05:21:36,490 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/libjars/snappy-java-1.1.1.6.jar] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:476)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:652)
Tue May 07 05:21:36 PDT 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2019-05-07 05:21:36,974 INFO [main] db.DBInputFormat: Using read commited transaction isolation
2019-05-07 05:21:36,975 INFO [main] db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `employee`
2019-05-07 05:21:36,995 INFO [main] db.IntegerSplitter: Split size: 1; Num splits: 4 from: 1201 to: 1205
2019-05-07 05:21:37,151 WARN [DataStreamer for file /tmp/hadoop-yarn/staging/root/.staging/job_1557223071171_0009/job.splitmetainfo block BP-2014112863-127.0.1.1-1556278796917:blk_1073742292_1468] hdfs.DFSClient: Caught exception
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:716)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeInternal(DFSOutputStream.java:684)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:680)
2019-05-07 05:21:37,152 INFO [main] mapreduce.JobSubmitter: number of splits:5
2019-05-07 05:21:37,450 INFO [main] mapreduce.JobSubmitter: Submitting tokens for job: job_1557223071171_0009
2019-05-07 05:21:38,537 INFO [main] impl.YarnClientImpl: Submitted application application_1557223071171_0009
2019-05-07 05:21:38,682 INFO [main] mapreduce.Job: The url to track the job: http://ubuntu:8088/proxy/application_1557223071171_0009/
2019-05-07 05:21:38,683 INFO [main] mapreduce.Job: Running job: job_1557223071171_0009
2019-05-07 05:21:58,166 INFO [main] mapreduce.Job: Job job_1557223071171_0009 running in uber mode : false
2019-05-07 05:21:58,169 INFO [main] mapreduce.Job: map 0% reduce 0%
2019-05-07 05:22:57,871 INFO [main] mapreduce.Job: map 20% reduce 0%
2019-05-07 05:22:58,922 INFO [main] mapreduce.Job: map 40% reduce 0%
2019-05-07 05:22:59,930 INFO [main] mapreduce.Job: map 80% reduce 0%
2019-05-07 05:23:00,967 INFO [main] mapreduce.Job: map 100% reduce 0%
2019-05-07 05:23:01,059 INFO [main] mapreduce.Job: Job job_1557223071171_0009 completed successfully
2019-05-07 05:23:01,466 WARN [main] counters.FileSystemCounterGroup: HDFS_BYTES_READ_EC is not a recognized counter.
2019-05-07 05:23:01,545 WARN [main] counters.FrameworkCounterGroup: MAP_PHYSICAL_MEMORY_BYTES_MAX is not a recognized counter.
2019-05-07 05:23:01,545 WARN [main] counters.FrameworkCounterGroup: MAP_VIRTUAL_MEMORY_BYTES_MAX is not a recognized counter.
2019-05-07 05:23:01,635 INFO [main] mapreduce.Job: =0
    File System Counters
        HDFS: Number of bytes read=521
        HDFS: Number of bytes written=50
        HDFS: Number of read operations=20
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=10
    Job Counters
        Killed map tasks=1
        Launched map tasks=5
        Other local map tasks=5
        Total time spent by all maps in occupied slots (ms)=282829
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=282829
        Total vcore-milliseconds taken by all map tasks=282829
        Total megabyte-milliseconds taken by all map tasks=289616896
    Map-Reduce Framework
        Map input records=4
        Map output records=4
        Input split bytes=521
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=4574
        CPU time spent (ms)=11550
        Physical memory (bytes) snapshot=655110144
        Virtual memory (bytes) snapshot=9562136576
        Total committed heap usage (bytes)=385679360
    File Input Format Counters
        Bytes Read=0
    File Output Format Counters
        Bytes Written=50
2019-05-07 05:23:01,644 WARN [main] counters.FileSystemCounterGroup: HDFS_BYTES_READ_EC is not a recognized counter.
2019-05-07 05:23:01,645 WARN [main] counters.FrameworkCounterGroup: MAP_PHYSICAL_MEMORY_BYTES_MAX is not a recognized counter.
2019-05-07 05:23:01,645 WARN [main] counters.FrameworkCounterGroup: MAP_VIRTUAL_MEMORY_BYTES_MAX is not a recognized counter.
2019-05-07 05:23:01,646 INFO [main] mapreduce.ImportJobBase: Transferred 50 bytes in 90.0107 seconds (0.5555 bytes/sec)
2019-05-07 05:23:01,688 WARN [main] counters.FileSystemCounterGroup: HDFS_BYTES_READ_EC is not a recognized counter.
2019-05-07 05:23:01,690 WARN [main] counters.FrameworkCounterGroup: MAP_PHYSICAL_MEMORY_BYTES_MAX is not a recognized counter.
2019-05-07 05:23:01,692 WARN [main] counters.FrameworkCounterGroup: MAP_VIRTUAL_MEMORY_BYTES_MAX is not a recognized counter.
2019-05-07 05:23:01,693 INFO [main] mapreduce.ImportJobBase: Retrieved 4 records.
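As a quick sanity check, the rate on the "Transferred 50 bytes in 90.0107 seconds" line is just bytes divided by wall time:

```shell
# 50 bytes over 90.0107 s matches the logged rate of 0.5555 bytes/sec.
awk 'BEGIN { printf "%.4f\n", 50 / 90.0107 }'   # prints 0.5555
```

The low throughput is dominated by per-task YARN overhead, not data volume: five map tasks were launched to move four rows.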
2019-05-07 05:23:01,693 INFO [main] mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table employee
Tue May 07 05:23:01 PDT 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2019-05-07 05:23:02,011 INFO [main] manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
2019-05-07 05:23:02,029 INFO [main] hive.HiveImport: Loading uploaded data into Hive
2019-05-07 05:23:02,158 INFO [main] conf.HiveConf: Found configuration file file:/usr/local/hive/conf/hive-site.xml
2019-05-07 05:23:10,859 main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
    at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
    at java.lang.SecurityManager.checkPermission(SecurityManager.java:585)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
    at org.apache.logging.log4j.core.jmx.Server.register(Server.java:389)
    at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:167)
    at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:140)
    at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:556)
    at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:261)
    at org.apache.logging.log4j.core.async.AsyncLoggerContext.start(AsyncLoggerContext.java:87)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:240)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:158)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:131)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:101)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:188)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:173)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:106)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:98)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:81)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
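The "Could not register mbeans ... MBeanTrustPermission" error above comes from Log4j2's JMX registration when Hive runs under a Java SecurityManager; it is noisy but does not abort the import (the Hive session starts right after). A commonly cited workaround is granting the permission in the JVM policy file. A hedged sketch: the real policy-file location depends on your JDK layout (typically $JAVA_HOME/jre/lib/security/java.policy on Java 8), so this writes to a scratch file to stay safe to run anywhere:

```shell
# Grant block that suppresses the MBeanTrustPermission denial; in practice
# append it to your JVM's java.policy (path is installation-specific).
policy="$(mktemp)"
cat >> "$policy" <<'EOF'
grant {
  permission javax.management.MBeanTrustPermission "register";
};
EOF
cat "$policy"
```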

Hive Session ID = bb53ebb3-6291-458c-a869-398e88041cb5
2019-05-07 05:23:11,274 INFO [main] SessionState: Hive Session ID = bb53ebb3-6291-458c-a869-398e88041cb5

Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-3.1.0.jar!/hive-log4j2.properties Async: true
2019-05-07 05:23:11,695 INFO [main] SessionState:
Logging initialized using configuration in jar:file:/usr/local/hive/lib/hive-common-3.1.0.jar!/hive-log4j2.properties Async: true
2019-05-07 05:23:11,807 INFO [main] session.SessionState: Created HDFS directory: /tmp/hive/root/bb53ebb3-6291-458c-a869-398e88041cb5
2019-05-07 05:23:12,174 INFO [main] session.SessionState: Created local directory: /tmp/hive/bb53ebb3-6291-458c-a869-398e88041cb5
2019-05-07 05:23:12,178 INFO [main] session.SessionState: Created HDFS directory: /tmp/hive/root/bb53ebb3-6291-458c-a869-398e88041cb5/_tmp_space.db
2019-05-07 05:23:12,199 INFO [main] conf.HiveConf: Using the default value passed in for log id: bb53ebb3-6291-458c-a869-398e88041cb5
2019-05-07 05:23:12,210 INFO [main] session.SessionState: Updating thread name to bb53ebb3-6291-458c-a869-398e88041cb5 main
2019-05-07 05:23:18,253 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2019-05-07 05:23:18,380 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
2019-05-07 05:23:18,407 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.ObjectStore: ObjectStore, initialize called
2019-05-07 05:23:18,436 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] conf.MetastoreConf: Found configuration file file:/usr/local/hive/conf/hive-site.xml
2019-05-07 05:23:18,458 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] conf.MetastoreConf: Unable to find config file hivemetastore-site.xml
2019-05-07 05:23:18,458 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] conf.MetastoreConf: Found configuration file null
2019-05-07 05:23:18,467 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] conf.MetastoreConf: Unable to find config file metastore-site.xml
2019-05-07 05:23:18,467 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] conf.MetastoreConf: Found configuration file null
2019-05-07 05:23:20,362 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
2019-05-07 05:23:22,067 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] hikari.HikariDataSource: HikariPool-1 - Starting...
Tue May 07 05:23:22 PDT 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2019-05-07 05:23:22,408 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] hikari.HikariDataSource: HikariPool-1 - Start completed.
2019-05-07 05:23:23,320 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] hikari.HikariDataSource: HikariPool-2 - Starting...
Tue May 07 05:23:23 PDT 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
2019-05-07 05:23:23,339 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] hikari.HikariDataSource: HikariPool-2 - Start completed.
  268. Tue May 07 05:23:23 PDT 2019 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
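The repeated Connector/J warning above can be silenced by stating the SSL intent explicitly in the JDBC URL, as the warning itself suggests. A minimal sketch of the re-run; the host, port, and database name here are placeholders (the log only shows the table `employee` and Hive target `hive_db.employee_sqoop`):

```shell
# Hypothetical re-run with SSL intent declared explicitly in the JDBC URL.
# useSSL=false disables SSL outright; for a verified connection use
# useSSL=true and point Connector/J at a truststore instead.
sqoop import \
  --connect "jdbc:mysql://localhost:3306/testdb?useSSL=false" \
  --username root -P \
  --table employee \
  --hive-import --hive-table hive_db.employee_sqoop
```

With `useSSL=false` the "Establishing SSL connection without server's identity verification" warning is no longer printed for each connection the job opens.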
  277. 2019-05-07 05:23:24,304 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
  278. 2019-05-07 05:23:25,017 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
  279. 2019-05-07 05:23:25,022 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.ObjectStore: Initialized ObjectStore
  280. 2019-05-07 05:23:26,382 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  281. 2019-05-07 05:23:26,391 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  282. 2019-05-07 05:23:26,391 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  283. 2019-05-07 05:23:26,392 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  284. 2019-05-07 05:23:26,392 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  285. 2019-05-07 05:23:26,392 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  286. 2019-05-07 05:23:32,683 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  287. 2019-05-07 05:23:32,684 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  288. 2019-05-07 05:23:32,690 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  289. 2019-05-07 05:23:32,690 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  290. 2019-05-07 05:23:32,691 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  291. 2019-05-07 05:23:32,693 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] DataNucleus.MetaData: Metadata has jdbc-type of null yet this is not valid. Ignored
  292. 2019-05-07 05:23:43,459 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: Added admin role in metastore
  293. 2019-05-07 05:23:43,481 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: Added public role in metastore
  294. 2019-05-07 05:23:43,621 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: No user is added in admin role, since config is empty
  295. 2019-05-07 05:23:46,015 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.RetryingMetaStoreClient: RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
  296. 2019-05-07 05:23:46,135 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: 0: get_all_functions
  297. 2019-05-07 05:23:46,155 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_all_functions
  298. 2019-05-07 05:23:46,306 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] conf.HiveConf: Using the default value passed in for log id: bb53ebb3-6291-458c-a869-398e88041cb5
  299. Hive Session ID = 6ce48e99-40f8-4e9c-a80d-c2574e1113b2
  300. 2019-05-07 05:23:46,320 INFO [pool-9-thread-1] SessionState: Hive Session ID = 6ce48e99-40f8-4e9c-a80d-c2574e1113b2
  301. 2019-05-07 05:23:46,379 INFO [pool-9-thread-1] session.SessionState: Created HDFS directory: /tmp/hive/root/6ce48e99-40f8-4e9c-a80d-c2574e1113b2
  302. 2019-05-07 05:23:46,383 INFO [pool-9-thread-1] session.SessionState: Created local directory: /tmp/hive/6ce48e99-40f8-4e9c-a80d-c2574e1113b2
  303. 2019-05-07 05:23:46,390 INFO [pool-9-thread-1] session.SessionState: Created HDFS directory: /tmp/hive/root/6ce48e99-40f8-4e9c-a80d-c2574e1113b2/_tmp_space.db
  304. 2019-05-07 05:23:46,404 INFO [pool-9-thread-1] metastore.HiveMetaStore: 1: get_databases: @hive#
  305. 2019-05-07 05:23:46,405 INFO [pool-9-thread-1] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_databases: @hive#
  306. 2019-05-07 05:23:46,411 INFO [pool-9-thread-1] metastore.HiveMetaStore: 1: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
  307. 2019-05-07 05:23:46,423 INFO [pool-9-thread-1] metastore.ObjectStore: ObjectStore, initialize called
  308. 2019-05-07 05:23:46,495 INFO [pool-9-thread-1] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
  309. 2019-05-07 05:23:46,501 INFO [pool-9-thread-1] metastore.ObjectStore: Initialized ObjectStore
  310. 2019-05-07 05:23:46,598 INFO [pool-9-thread-1] metastore.HiveMetaStore: 1: get_tables_by_type: db=@hive#default pat=.*,type=MATERIALIZED_VIEW
  311. 2019-05-07 05:23:46,600 INFO [pool-9-thread-1] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_tables_by_type: db=@hive#default pat=.*,type=MATERIALIZED_VIEW
  312. 2019-05-07 05:23:46,688 INFO [pool-9-thread-1] metastore.HiveMetaStore: 1: get_multi_table : db=default tbls=
  313. 2019-05-07 05:23:46,688 INFO [pool-9-thread-1] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_multi_table : db=default tbls=
  314. 2019-05-07 05:23:46,694 INFO [pool-9-thread-1] metastore.HiveMetaStore: 1: get_tables_by_type: db=@hive#hive_db pat=.*,type=MATERIALIZED_VIEW
  315. 2019-05-07 05:23:46,694 INFO [pool-9-thread-1] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_tables_by_type: db=@hive#hive_db pat=.*,type=MATERIALIZED_VIEW
  316. 2019-05-07 05:23:46,722 INFO [pool-9-thread-1] metastore.HiveMetaStore: 1: get_multi_table : db=hive_db tbls=
  317. 2019-05-07 05:23:46,722 INFO [pool-9-thread-1] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_multi_table : db=hive_db tbls=
  318. 2019-05-07 05:23:46,723 INFO [pool-9-thread-1] metadata.HiveMaterializedViewsRegistry: Materialized views registry has been initialized
  319. 2019-05-07 05:23:47,207 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: Compiling command(queryId=root_20190507052346_f16b38f5-ae5f-4a87-80c9-c37c079b35e8): CREATE TABLE `hive_db`.`employee_sqoop` ( `id` INT, `name` STRING) COMMENT 'Imported by sqoop on 2019/05/07 05:23:02' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054' LINES TERMINATED BY '\012' STORED AS TEXTFILE
  320. 2019-05-07 05:23:51,448 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
  321. 2019-05-07 05:23:51,487 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] parse.CalcitePlanner: Starting Semantic Analysis
  322. 2019-05-07 05:23:51,597 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] sqlstd.SQLStdHiveAccessController: Created SQLStdHiveAccessController for session context : HiveAuthzSessionContext [sessionString=bb53ebb3-6291-458c-a869-398e88041cb5, clientType=HIVECLI]
  323. 2019-05-07 05:23:51,607 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] session.SessionState: METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
  324. 2019-05-07 05:23:51,611 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStoreClient: Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl to org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook
  325. 2019-05-07 05:23:51,621 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
  326. 2019-05-07 05:23:51,622 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=Cleaning up thread local RawStore...
  327. 2019-05-07 05:23:51,630 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
  328. 2019-05-07 05:23:51,634 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=Done cleaning up thread local RawStore
  329. 2019-05-07 05:23:51,640 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
  330. 2019-05-07 05:23:51,641 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
  331. 2019-05-07 05:23:51,642 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.ObjectStore: ObjectStore, initialize called
  332. 2019-05-07 05:23:51,705 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
  333. 2019-05-07 05:23:51,706 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.ObjectStore: Initialized ObjectStore
  334. 2019-05-07 05:23:51,714 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.RetryingMetaStoreClient: RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
  335. 2019-05-07 05:23:51,861 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] parse.CalcitePlanner: Creating table hive_db.employee_sqoop position=13
  336. 2019-05-07 05:23:51,980 WARN [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.ObjectStore: datanucleus.autoStartMechanismMode is set to unsupported value null . Setting it to value: ignored
  337. 2019-05-07 05:23:51,980 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.ObjectStore: ObjectStore, initialize called
  338. 2019-05-07 05:23:52,005 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
  339. 2019-05-07 05:23:52,005 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.ObjectStore: Initialized ObjectStore
  340. 2019-05-07 05:23:52,023 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.RetryingMetaStoreClient: RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
  341. 2019-05-07 05:23:52,037 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: 0: get_database: @hive#hive_db
  342. 2019-05-07 05:23:52,038 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_database: @hive#hive_db
  343. 2019-05-07 05:23:52,218 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: Semantic Analysis Completed (retrial = false)
  344. 2019-05-07 05:23:52,310 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
  345. 2019-05-07 05:23:52,350 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: Completed compiling command(queryId=root_20190507052346_f16b38f5-ae5f-4a87-80c9-c37c079b35e8); Time taken: 5.335 seconds
  346. 2019-05-07 05:23:52,353 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] reexec.ReExecDriver: Execution #1 of query
  347. 2019-05-07 05:23:52,359 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
  348. 2019-05-07 05:23:52,364 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: Executing command(queryId=root_20190507052346_f16b38f5-ae5f-4a87-80c9-c37c079b35e8): CREATE TABLE `hive_db`.`employee_sqoop` ( `id` INT, `name` STRING) COMMENT 'Imported by sqoop on 2019/05/07 05:23:02' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054' LINES TERMINATED BY '\012' STORED AS TEXTFILE
  349. 2019-05-07 05:23:52,452 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: Starting task [Stage-0:DDL] in serial mode
  350. 2019-05-07 05:23:52,455 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStoreClient: Mestastore configuration metastore.filter.hook changed from org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook to org.apache.hadoop.hive.metastore.DefaultMetaStoreFilterHookImpl
  351. 2019-05-07 05:23:52,455 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: 0: Cleaning up thread local RawStore...
  352. 2019-05-07 05:23:52,455 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=Cleaning up thread local RawStore...
  353. 2019-05-07 05:23:52,455 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] metastore.HiveMetaStore: 0: Done cleaning up thread local RawStore
  354. 2019-05-07 05:23:52,456 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=Done cleaning up thread local RawStore
  355. 2019-05-07 05:23:52,789 ERROR [bb53ebb3-6291-458c-a869-398e88041cb5 main] exec.DDLTask: Failed
  356. java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;
  357. at org.apache.hadoop.hive.common.StatsSetupConst$ColumnStatsAccurate.<clinit>(StatsSetupConst.java:166)
  358. at org.apache.hadoop.hive.common.StatsSetupConst.parseStatsAcc(StatsSetupConst.java:314)
  359. at org.apache.hadoop.hive.common.StatsSetupConst.setBasicStatsState(StatsSetupConst.java:231)
  360. at org.apache.hadoop.hive.common.StatsSetupConst.setStatsStateForCreateTable(StatsSetupConst.java:306)
  361. at org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:868)
  362. at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4913)
  363. at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:428)
  364. at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
  365. at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
  366. at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664)
  367. at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335)
  368. at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011)
  369. at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709)
  370. at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703)
  371. at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
  372. at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
  373. at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
  374. at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
  375. at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
  376. at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:335)
  377. at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:471)
  378. at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:487)
  379. at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793)
  380. at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
  381. at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
  382. at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  383. at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  384. at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  385. at java.lang.reflect.Method.invoke(Method.java:498)
  386. at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
  387. at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
  388. at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
  389. at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
  390. at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
  391. at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
  392. at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
  393. at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
  394. at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
  395. at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
  396. 2019-05-07 05:23:52,799 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] reexec.ReOptimizePlugin: ReOptimization: retryPossible: false
  397. FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;
  398. 2019-05-07 05:23:52,802 ERROR [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;
  399. 2019-05-07 05:23:52,802 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: Completed executing command(queryId=root_20190507052346_f16b38f5-ae5f-4a87-80c9-c37c079b35e8); Time taken: 0.443 seconds
  400. 2019-05-07 05:23:52,802 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] ql.Driver: Concurrency mode is disabled, not creating a lock manager
  401. 2019-05-07 05:23:52,819 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] conf.HiveConf: Using the default value passed in for log id: bb53ebb3-6291-458c-a869-398e88041cb5
  402. 2019-05-07 05:23:52,819 INFO [bb53ebb3-6291-458c-a869-398e88041cb5 main] session.SessionState: Resetting thread name to main
  403. 2019-05-07 05:23:52,821 INFO [main] conf.HiveConf: Using the default value passed in for log id: bb53ebb3-6291-458c-a869-398e88041cb5
  404. 2019-05-07 05:23:52,858 INFO [main] session.SessionState: Deleted directory: /tmp/hive/root/bb53ebb3-6291-458c-a869-398e88041cb5 on fs with scheme hdfs
  405. 2019-05-07 05:23:52,861 INFO [main] session.SessionState: Deleted directory: /tmp/hive/bb53ebb3-6291-458c-a869-398e88041cb5 on fs with scheme file
  406. 2019-05-07 05:23:52,928 ERROR [main] tool.ImportTool: Import failed: java.io.IOException: Hive CliDriver exited with status=1
  407. at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:355)
  408. at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
  409. at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
  410. at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
  411. at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
  412. at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
  413. at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
  414. at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
  415. at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
  416. at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
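The root cause above is `NoSuchMethodError: ObjectMapper.readerFor`, which typically means an older jackson-databind jar (the method exists only in Jackson 2.6+) was loaded ahead of the one Hive needs. A common remedy is to replace the older Jackson jars bundled with Sqoop with Hive's newer ones. A sketch, assuming Sqoop lives under /usr/local/sqoop (the log only confirms /usr/local/hadoop, /usr/local/hive, and /usr/local/hbase):

```shell
# Check which jackson-databind versions are on the classpath.
# /usr/local/sqoop is an assumed install path, not shown in this log.
ls /usr/local/sqoop/lib/jackson-*.jar /usr/local/hive/lib/jackson-*.jar

# Hypothetical fix: set Sqoop's older Jackson jars aside and use Hive's instead.
mkdir -p /usr/local/sqoop/lib/jackson-backup
mv /usr/local/sqoop/lib/jackson-*.jar /usr/local/sqoop/lib/jackson-backup/
cp /usr/local/hive/lib/jackson-*.jar /usr/local/sqoop/lib/
```

After aligning the Jackson versions, the `CREATE TABLE` DDL task that failed above should be retried by re-running the import.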