sqoop export --connect "jdbc:sqlserver://..." --username=... \
  --password=... --hcatalog-database ... --hcatalog-table ... \
  --hcatalog-partition-keys ... --hcatalog-partition-values ... \
  --table ... -- --schema ...
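
For reference, a fully spelled-out invocation of this shape might look like the sketch below. Every identifier in it is a hypothetical placeholder, not a recovered value from the elided command, and --password-file stands in for the cleartext --password as the usual hardening:

# All names below are placeholders; adjust to the real cluster and database.
sqoop export \
  --connect "jdbc:sqlserver://dbhost:1433;database=reporting" \
  --username etl_user \
  --password-file /user/etl/.sqoop.password \
  --hcatalog-database analytics \
  --hcatalog-table daily_metrics \
  --hcatalog-partition-keys load_date \
  --hcatalog-partition-values 2018-10-31 \
  --table daily_metrics \
  -- --schema dbo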

Log Contents:
2018-10-31 10:04:12,371 WARN [main] org.apache.hadoop.metrics2.impl.MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-maptask.properties,hadoop-metrics2.properties
2018-10-31 10:04:12,463 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2018-10-31 10:04:12,463 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started
2018-10-31 10:04:12,475 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
2018-10-31 10:04:12,476 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1539329114857_75069, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@1224144a)
2018-10-31 10:04:12,770 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
2018-10-31 10:04:13,085 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /hadoopmetadata/yarn/local/usercache/.../appcache/application_1539329114857_75069
2018-10-31 10:04:13,449 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2018-10-31 10:04:14,101 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : [ ]
2018-10-31 10:04:14,372 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: org.apache.sqoop.mapreduce.hcat.SqoopHCatInputSplit@2ca65ce4
2018-10-31 10:04:14,919 INFO [main] org.apache.hadoop.hive.ql.io.orc.ReaderImpl: Reading ORC rows from hdfs://.../000001_0 with {include: null, offset: 0, length: 1986948}
2018-10-31 10:04:14,993 INFO [main] org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl: Schema on read not provided -- using file schema [kind: STRUCT
subtypes: 1
subtypes: 2
subtypes: 3
subtypes: 4
subtypes: 5
subtypes: 6
subtypes: 7
subtypes: 8
subtypes: 9
subtypes: 10
subtypes: 11
subtypes: 12
subtypes: 13
subtypes: 14
subtypes: 15
subtypes: 16
subtypes: 17
subtypes: 18
subtypes: 19
subtypes: 20
subtypes: 21
subtypes: 22
subtypes: 23
subtypes: 24
subtypes: 25
subtypes: 26
fieldNames: "_col0"
fieldNames: "_col1"
fieldNames: "_col2"
fieldNames: "_col3"
fieldNames: "_col4"
fieldNames: "_col5"
fieldNames: "_col6"
fieldNames: "_col7"
fieldNames: "_col8"
fieldNames: "_col9"
fieldNames: "_col10"
fieldNames: "_col11"
fieldNames: "_col12"
fieldNames: "_col13"
fieldNames: "_col14"
fieldNames: "_col15"
fieldNames: "_col16"
fieldNames: "_col17"
fieldNames: "_col18"
fieldNames: "_col19"
fieldNames: "_col20"
fieldNames: "_col21"
fieldNames: "_col22"
fieldNames: "_col23"
fieldNames: "_col24"
fieldNames: "_col25"
, kind: DOUBLE
, kind: DOUBLE
, kind: DOUBLE
, kind: DOUBLE
, kind: DOUBLE
, kind: DOUBLE
, kind: LONG
, kind: LONG
, kind: LONG
, kind: DOUBLE
, kind: DOUBLE
, kind: DOUBLE
, kind: DOUBLE
, kind: DOUBLE
, kind: STRING
, kind: STRING
, kind: INT
, kind: INT
, kind: STRING
, kind: STRING
, kind: STRING
, kind: STRING
, kind: STRING
, kind: STRING
, kind: STRING
, kind: STRING
]
2018-10-31 10:04:15,080 INFO [main] org.apache.hive.hcatalog.mapreduce.InternalUtil: Initializing org.apache.hadoop.hive.ql.io.orc.OrcSerde with properties {transient_lastDdlTime=1524238479, name=..., serialization.null.format=N, columns=..., serialization.lib=org.apache.hadoop.hive.ql.io.orc.OrcSerde, serialization.format=1, columns.types=double,double,double,double,double,double,bigint,bigint,bigint,double,double,double,double,double,string,string,int,int,string,string,string,string,string,string,string,string, columns.comments=nullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnullnull}
2018-10-31 10:04:17,349 WARN [Thread-12] org.apache.sqoop.mapreduce.SQLServerExportDBExecThread: Error executing statement: java.sql.BatchUpdateException: Error converting data type nvarchar to decimal.
2018-10-31 10:04:17,350 WARN [Thread-12] org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread: Trying to recover from DB write failure:
java.sql.BatchUpdateException: Error converting data type nvarchar to decimal.
    at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeBatch(SQLServerPreparedStatement.java:1870)
    at org.apache.sqoop.mapreduce.SQLServerExportDBExecThread.executeStatement(SQLServerExportDBExecThread.java:96)
    at org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread.write(SQLServerAsyncDBExecThread.java:272)
    at org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread.run(SQLServerAsyncDBExecThread.java:240)
2018-10-31 10:04:17,354 WARN [Thread-12] org.apache.sqoop.mapreduce.db.SQLServerConnectionFailureHandler: Cannot handle error with SQL State: S0005
2018-10-31 10:04:17,354 ERROR [Thread-12] org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread: Failed to write records.
java.io.IOException: Registered handler cannot recover error with SQL State: S0005, error code: 8114
    at org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread.write(SQLServerAsyncDBExecThread.java:293)
    at org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread.run(SQLServerAsyncDBExecThread.java:240)
Caused by: java.sql.BatchUpdateException: Error converting data type nvarchar to decimal.
    at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeBatch(SQLServerPreparedStatement.java:1870)
    at org.apache.sqoop.mapreduce.SQLServerExportDBExecThread.executeStatement(SQLServerExportDBExecThread.java:96)
    at org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread.write(SQLServerAsyncDBExecThread.java:272)
    ... 1 more
2018-10-31 10:04:17,354 ERROR [Thread-12] org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread: Got exception in update thread: java.io.IOException: Registered handler cannot recover error with SQL State: S0005, error code: 8114
    at org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread.write(SQLServerAsyncDBExecThread.java:293)
    at org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread.run(SQLServerAsyncDBExecThread.java:240)
Caused by: java.sql.BatchUpdateException: Error converting data type nvarchar to decimal.
    at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeBatch(SQLServerPreparedStatement.java:1870)
    at org.apache.sqoop.mapreduce.SQLServerExportDBExecThread.executeStatement(SQLServerExportDBExecThread.java:96)
    at org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread.write(SQLServerAsyncDBExecThread.java:272)
    ... 1 more

2018-10-31 10:04:17,568 ERROR [main] org.apache.sqoop.mapreduce.SQLServerAsyncDBExecThread: Asynchronous writer thread encountered the following exception: java.io.IOException: Registered handler cannot recover error with SQL State: S0005, error code: 8114
2018-10-31 10:04:17,569 INFO [Thread-13] org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
End of LogType:syslog
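
The failure itself is narrow: SQL Server error 8114 with SQLSTATE S0005 ("Error converting data type nvarchar to decimal") means a value arriving in one of the ORC file's STRING columns (_col14, _col15, and _col18 through _col25 in the schema above) does not parse as a number when SQL Server converts it into a target column declared as decimal. A hedged way to surface the bad values before re-running the export: in Hive, CAST of a non-numeric string to DECIMAL yields NULL, so a query of the following shape lists them. All database, table, partition, and column names below are hypothetical placeholders, not the values elided in the command at the top.

# Placeholder names throughout; non-NULL source values whose DECIMAL cast
# comes back NULL are the ones SQL Server will reject on export.
hive -e "
SELECT suspect_col, COUNT(*) AS n
FROM analytics.daily_metrics
WHERE load_date = '2018-10-31'
  AND suspect_col IS NOT NULL
  AND CAST(suspect_col AS DECIMAL(18,4)) IS NULL
GROUP BY suspect_col
LIMIT 100"

To see which target columns are actually declared decimal on the SQL Server side, the catalog view can be queried through sqoop eval with the same (placeholder) connection settings:

# Placeholder connection details, schema, and table name.
sqoop eval \
  --connect "jdbc:sqlserver://dbhost:1433;database=reporting" \
  --username etl_user \
  --password-file /user/etl/.sqoop.password \
  --query "SELECT COLUMN_NAME, DATA_TYPE, NUMERIC_PRECISION, NUMERIC_SCALE FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'daily_metrics' ORDER BY ORDINAL_POSITION"

With the offending values and the declared precision in hand, the usual fix is to clean the source rows, or to export from a staging table or view that CASTs the string columns to the exact DECIMAL(p,s) the target expects.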