- sqoop import --direct --connect jdbc:netezza://umabofanzd05h1:1234/dodsdb01 --table catalog_sales --username admin --password xxxxxxx -m 1
- Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
- 16/08/07 13:38:59 INFO mapreduce.Job: map 0% reduce 0%
- 16/08/07 13:39:12 INFO mapreduce.Job: map 100% reduce 0%
- 16/08/07 14:25:57 INFO mapreduce.Job: Job job_1467176512921_0015 completed successfully
- 16/08/07 14:25:57 INFO mapreduce.Job: Counters: 30
- File System Counters
- FILE: Number of bytes read=0
- FILE: Number of bytes written=142451
- FILE: Number of read operations=0
- FILE: Number of large read operations=0
- FILE: Number of write operations=0
- HDFS: Number of bytes read=70
- HDFS: Number of bytes written=117840708989
- HDFS: Number of read operations=4
- HDFS: Number of large read operations=0
- HDFS: Number of write operations=2
- Job Counters
- Launched map tasks=1
- Other local map tasks=1
- Total time spent by all maps in occupied slots (ms)=2815190
- Total time spent by all reduces in occupied slots (ms)=0
- Total time spent by all map tasks (ms)=2815190
- Total vcore-seconds taken by all map tasks=2815190
- Total megabyte-seconds taken by all map tasks=2882754560
- Map-Reduce Framework
- Map input records=1
- Map output records=576001697
- Input split bytes=70
- Spilled Records=0
- Failed Shuffles=0
- Merged Map outputs=0
- GC time elapsed (ms)=26051
- CPU time spent (ms)=2647890
- Physical memory (bytes) snapshot=302415872
- Virtual memory (bytes) snapshot=1586044928
- Total committed heap usage (bytes)=179306496
- File Input Format Counters
- Bytes Read=0
- File Output Format Counters
- Bytes Written=117840708989
- 16/08/07 14:25:57 INFO mapreduce.ImportJobBase: Transferred 109.7477 GB in 2,831.1265 seconds (39.695 MB/sec)
- 16/08/07 14:25:57 INFO mapreduce.ImportJobBase: Retrieved 576001697 records.
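The summary line above can be sanity-checked against the counters. A small sketch, assuming Sqoop reports GB as 2^30 bytes and MB/sec as 2^20 bytes per second (which matches the numbers in this log):

```python
# Reproduce the ImportJobBase summary from the job counters above.
# Assumption: GB = 2**30 bytes, MB/sec = 2**20 bytes per second.
bytes_written = 117_840_708_989   # "HDFS: Number of bytes written"
elapsed_s = 2831.1265             # seconds reported by ImportJobBase

gb = bytes_written / 2**30
mb_per_s = (bytes_written / 2**20) / elapsed_s
print(f"{gb:.4f} GB at {mb_per_s:.3f} MB/sec")
# → 109.7477 GB at 39.695 MB/sec (matching the log's 109.7477 GB / 39.695 MB/sec)
```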
- sqoop import --direct --connect jdbc:netezza://umabofanzd05h1:1234/dodsdb01 --table catalog_sales --username admin --password xxxxxx -m 3
- Warning: /usr/lib/sqoop/../hbase does not exist! HBase imports will fail.
- Please set $HBASE_HOME to the root of your HBase installation.
- Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
- Please set $ACCUMULO_HOME to the root of your Accumulo installation.
- 16/08/08 12:50:29 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.1
- 16/08/08 12:50:29 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
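As the WARN line notes, putting `--password` on the command line is insecure. Besides `-P` (interactive prompt), Sqoop supports an options file; a sketch of that approach (the file name `sqoop-import.options` is illustrative, and Sqoop expects one option or value per line):

```shell
# Illustrative only: keep connection arguments in a permission-restricted
# options file and use -P to prompt for the password at run time,
# instead of passing --password on the command line.
cat > sqoop-import.options <<'EOF'
import
--direct
--connect
jdbc:netezza://umabofanzd05h1:1234/dodsdb01
--table
catalog_sales
--username
admin
-P
EOF
chmod 600 sqoop-import.options
# The job would then be launched as:
#   sqoop --options-file sqoop-import.options -m 3
cat sqoop-import.options
```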
- 16/08/08 12:50:29 INFO manager.SqlManager: Using default fetchSize of 1000
- 16/08/08 12:50:29 INFO tool.CodeGenTool: The connection manager declares that it self manages mapping between records & fields and rows & columns. No class will be generated.
- 16/08/08 12:50:29 INFO manager.DirectNetezzaManager: Beginning netezza fast path import
- 16/08/08 12:50:29 INFO mapreduce.ImportJobBase: Beginning import of catalog_sales
- 16/08/08 12:50:30 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
- 16/08/08 12:50:30 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "catalog_sales" AS t WHERE 1=0
- 16/08/08 12:50:31 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
- 16/08/08 12:50:31 INFO client.RMProxy: Connecting to ResourceManager at dmaubofadev01.bankofamerica.com/159.127.3.179:8032
- 16/08/08 12:50:35 INFO mapreduce.JobSubmitter: number of splits:3
- 16/08/08 12:50:36 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1467176512921_0018
- 16/08/08 12:50:36 INFO impl.YarnClientImpl: Submitted application application_1467176512921_0018
- 16/08/08 12:50:37 INFO mapreduce.Job: The url to track the job: http://dmaubofadev01.bankofamerica.com:8088/proxy/application_1467176512921_0018/
- 16/08/08 12:50:37 INFO mapreduce.Job: Running job: job_1467176512921_0018
- 16/08/08 12:50:47 INFO mapreduce.Job: Job job_1467176512921_0018 running in uber mode : false
- 16/08/08 12:50:47 INFO mapreduce.Job: map 0% reduce 0%
- 16/08/08 12:51:01 INFO mapreduce.Job: map 33% reduce 0%
- 16/08/08 12:51:05 INFO mapreduce.Job: map 67% reduce 0%
- 16/08/08 12:57:43 INFO mapreduce.Job: Task Id : attempt_1467176512921_0018_m_000000_0, Status : FAILED
- Error: java.io.IOException: java.sql.SQLException: Error occured while writing to file.
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:232)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:53)
- at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
- at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
- at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
- at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
- at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.security.auth.Subject.doAs(Subject.java:415)
- at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
- at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
- Caused by: java.sql.SQLException: Error occured while writing to file.
- at org.netezza.externaltable.NzUnload.cleanupUnload(NzUnload.java:238)
- at org.netezza.externaltable.NzUnload.unload(NzUnload.java:143)
- at org.netezza.internal.QueryExecutor.getNextResult(QueryExecutor.java:252)
- at org.netezza.internal.QueryExecutor.execute(QueryExecutor.java:76)
- at org.netezza.sql.NzConnection.execute(NzConnection.java:2904)
- at org.netezza.sql.NzStatement._execute(NzStatement.java:885)
- at org.netezza.sql.NzPreparedStatament.execute(NzPreparedStatament.java:187)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaJDBCStatementRunner.run(NetezzaJDBCStatementRunner.java:75)
- 16/08/08 12:57:44 INFO mapreduce.Job: map 33% reduce 0%
- 16/08/08 12:57:48 INFO mapreduce.Job: Task Id : attempt_1467176512921_0018_m_000001_0, Status : FAILED
- Exception from container-launch.
- Container id: container_1467176512921_0018_01_000003
- Exit code: 1
- Stack trace: ExitCodeException exitCode=1:
- at org.apache.hadoop.util.Shell.runCommand(Shell.java:561)
- at org.apache.hadoop.util.Shell.run(Shell.java:478)
- at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:738)
- at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:213)
- at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
- at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
- at java.util.concurrent.FutureTask.run(FutureTask.java:262)
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
- at java.lang.Thread.run(Thread.java:745)
- Container exited with a non-zero exit code 1
- 16/08/08 12:57:49 INFO mapreduce.Job: map 0% reduce 0%
- 16/08/08 12:57:50 INFO mapreduce.Job: Task Id : attempt_1467176512921_0018_m_000000_1, Status : FAILED
- Error: java.lang.InterruptedException
- at java.lang.Object.wait(Native Method)
- at java.lang.Thread.join(Thread.java:1281)
- at java.lang.Thread.join(Thread.java:1355)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:227)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:53)
- at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
- at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
- at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
- at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
- at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.security.auth.Subject.doAs(Subject.java:415)
- at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
- at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
- 16/08/08 12:57:59 INFO mapreduce.Job: map 33% reduce 0%
- 16/08/08 12:58:02 INFO mapreduce.Job: map 67% reduce 0%
- 16/08/08 13:01:41 INFO mapreduce.Job: Task Id : attempt_1467176512921_0018_m_000002_0, Status : FAILED
- Error: java.io.IOException: java.sql.SQLException: Error occured while writing to file.
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:232)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:53)
- at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
- at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
- at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
- at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
- at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.security.auth.Subject.doAs(Subject.java:415)
- at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
- at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
- Caused by: java.sql.SQLException: Error occured while writing to file.
- at org.netezza.externaltable.NzUnload.cleanupUnload(NzUnload.java:238)
- at org.netezza.externaltable.NzUnload.unload(NzUnload.java:143)
- at org.netezza.internal.QueryExecutor.getNextResult(QueryExecutor.java:252)
- at org.netezza.internal.QueryExecutor.execute(QueryExecutor.java:76)
- at org.netezza.sql.NzConnection.execute(NzConnection.java:2904)
- at org.netezza.sql.NzStatement._execute(NzStatement.java:885)
- at org.netezza.sql.NzPreparedStatament.execute(NzPreparedStatament.java:187)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaJDBCStatementRunner.run(NetezzaJDBCStatementRunner.java:75)
- 16/08/08 13:01:42 INFO mapreduce.Job: map 33% reduce 0%
- 16/08/08 13:01:45 INFO mapreduce.Job: Task Id : attempt_1467176512921_0018_m_000001_1, Status : FAILED
- Error: java.io.IOException: java.sql.SQLException: Error occured while writing to file.
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:232)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:53)
- at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
- at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
- at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
- at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
- at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.security.auth.Subject.doAs(Subject.java:415)
- at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
- at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
- Caused by: java.sql.SQLException: Error occured while writing to file.
- at org.netezza.externaltable.NzUnload.cleanupUnload(NzUnload.java:238)
- at org.netezza.externaltable.NzUnload.unload(NzUnload.java:143)
- at org.netezza.internal.QueryExecutor.getNextResult(QueryExecutor.java:252)
- at org.netezza.internal.QueryExecutor.execute(QueryExecutor.java:76)
- at org.netezza.sql.NzConnection.execute(NzConnection.java:2904)
- at org.netezza.sql.NzStatement._execute(NzStatement.java:885)
- at org.netezza.sql.NzPreparedStatament.execute(NzPreparedStatament.java:187)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaJDBCStatementRunner.run(NetezzaJDBCStatementRunner.java:75)
- 16/08/08 13:01:46 INFO mapreduce.Job: map 0% reduce 0%
- 16/08/08 13:01:48 INFO mapreduce.Job: Task Id : attempt_1467176512921_0018_m_000000_2, Status : FAILED
- Error: java.io.IOException: java.sql.SQLException: Error occured while writing to file.
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:232)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:53)
- at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
- at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
- at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
- at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
- at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.security.auth.Subject.doAs(Subject.java:415)
- at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
- at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
- Caused by: java.sql.SQLException: Error occured while writing to file.
- at org.netezza.externaltable.NzUnload.cleanupUnload(NzUnload.java:238)
- at org.netezza.externaltable.NzUnload.unload(NzUnload.java:143)
- at org.netezza.internal.QueryExecutor.getNextResult(QueryExecutor.java:252)
- at org.netezza.internal.QueryExecutor.execute(QueryExecutor.java:76)
- at org.netezza.sql.NzConnection.execute(NzConnection.java:2904)
- at org.netezza.sql.NzStatement._execute(NzStatement.java:885)
- at org.netezza.sql.NzPreparedStatament.execute(NzPreparedStatament.java:187)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaJDBCStatementRunner.run(NetezzaJDBCStatementRunner.java:75)
- 16/08/08 13:01:59 INFO mapreduce.Job: map 33% reduce 0%
- 16/08/08 13:02:01 INFO mapreduce.Job: map 67% reduce 0%
- 16/08/08 13:05:54 INFO mapreduce.Job: Task Id : attempt_1467176512921_0018_m_000002_1, Status : FAILED
- Error: java.lang.InterruptedException
- at java.lang.Object.wait(Native Method)
- at java.lang.Thread.join(Thread.java:1281)
- at java.lang.Thread.join(Thread.java:1355)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:227)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:53)
- at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
- at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
- at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
- at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
- at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.security.auth.Subject.doAs(Subject.java:415)
- at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
- at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
- 16/08/08 13:05:55 INFO mapreduce.Job: map 33% reduce 0%
- 16/08/08 13:05:55 INFO mapreduce.Job: Task Id : attempt_1467176512921_0018_m_000001_2, Status : FAILED
- Error: java.lang.InterruptedException
- at java.lang.Object.wait(Native Method)
- at java.lang.Thread.join(Thread.java:1281)
- at java.lang.Thread.join(Thread.java:1355)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:227)
- at org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableImportMapper.map(NetezzaExternalTableImportMapper.java:53)
- at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
- at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
- at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
- at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
- at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
- at java.security.AccessController.doPrivileged(Native Method)
- at javax.security.auth.Subject.doAs(Subject.java:415)
- at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
- at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
- 16/08/08 13:05:56 INFO mapreduce.Job: map 0% reduce 0%
- 16/08/08 13:06:02 INFO mapreduce.Job: map 100% reduce 0%
- 16/08/08 13:06:02 INFO mapreduce.Job: Job job_1467176512921_0018 failed with state FAILED due to: Task failed task_1467176512921_0018_m_000000
- Job failed as tasks failed. failedMaps:1 failedReduces:0
- 16/08/08 13:06:02 INFO mapreduce.Job: Counters: 11
- Job Counters
- Failed map tasks=9
- Killed map tasks=1
- Launched map tasks=10
- Other local map tasks=10
- Total time spent by all maps in occupied slots (ms)=1813316
- Total time spent by all map tasks (ms)=1813316
- Total vcore-seconds taken by all map tasks=1813316
- Total megabyte-seconds taken by all map tasks=1856835584
- Map-Reduce Framework
- CPU time spent (ms)=0
- Physical memory (bytes) snapshot=0
- Virtual memory (bytes) snapshot=0
- 16/08/08 13:06:02 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
- 16/08/08 13:06:02 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 931.1929 seconds (0 bytes/sec)
- 16/08/08 13:06:02 INFO mapreduce.ImportJobBase: Retrieved 0 records.
- 16/08/08 13:06:02 ERROR tool.ImportTool: Error during import: Import job failed!
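When triaging a run like this, it helps to tally failed attempts per map task from the `Task Id : attempt_... Status : FAILED` lines. Eight failed attempts appear in the excerpt above (the counters report nine, so one attempt is presumably not shown); a small parsing sketch with those lines inlined:

```python
import re
from collections import Counter

# The failed-attempt lines from the log above (timestamps trimmed).
log = """\
Task Id : attempt_1467176512921_0018_m_000000_0, Status : FAILED
Task Id : attempt_1467176512921_0018_m_000001_0, Status : FAILED
Task Id : attempt_1467176512921_0018_m_000000_1, Status : FAILED
Task Id : attempt_1467176512921_0018_m_000002_0, Status : FAILED
Task Id : attempt_1467176512921_0018_m_000001_1, Status : FAILED
Task Id : attempt_1467176512921_0018_m_000000_2, Status : FAILED
Task Id : attempt_1467176512921_0018_m_000002_1, Status : FAILED
Task Id : attempt_1467176512921_0018_m_000001_2, Status : FAILED
"""

# attempt_<cluster>_<job>_m_<task>_<attempt>: group failed attempts by task id.
pat = re.compile(r"Task Id : attempt_\d+_\d+_m_(\d+)_\d+, Status : FAILED")
fails = Counter(pat.findall(log))
for task, n in sorted(fails.items()):
    print(f"map task {task}: {n} failed attempts")
# → all three map tasks failed repeatedly, so the problem is systemic,
#   not one bad split (task 000000 hit its retry limit and killed the job).
```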
- Hadoop 2.6.0-cdh5.4.3
- Subversion http://github.com/cloudera/hadoop -r 4cd9f51a3f1ef748d45b8d77d0f211ad44296d4b
- Compiled by jenkins on 2015-06-25T02:34Z
- Compiled with protoc 2.5.0
- From source with checksum 4acea6ac185376e0b48b33695e88e7a7
- This command was run using /usr/lib/hadoop/hadoop-common-2.6.0-cdh5.4.3.jar
- 16/08/09 09:50:09 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.3
- Sqoop 1.4.5-cdh5.4.3
- git commit id
- Compiled by jenkins on Wed Jun 24 19:29:11 PDT 2015
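The pattern in the logs above is that the single-mapper direct import completed, while every multi-mapper attempt died inside the Netezza external-table unload (`NzUnload`). One hedged avenue, not taken from the log, is to drop `--direct` and fall back to a plain JDBC import with an explicit split column; the column name `cs_sold_date_sk` below is illustrative, not confirmed from the source table:

```shell
# Hypothetical fallback (not from the log above): a non-direct JDBC import,
# split on a column, avoids the external-table unload path entirely.
cmd=(sqoop import
  --connect jdbc:netezza://umabofanzd05h1:1234/dodsdb01
  --table catalog_sales
  --username admin -P
  --num-mappers 3
  --split-by cs_sold_date_sk)   # illustrative split column
printf '%s ' "${cmd[@]}"; echo
```

This trades the direct path's throughput for the stock JDBC fetch, so it is a diagnostic step rather than a tuning recommendation.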