hadoop@ip-10-202-163-18:~$ HADOOP_HOME=/home/hadoop/ sqoop-1.4.4.bin__hadoop-1.0.0/bin/sqoop import --fields-terminated-by '\t' --lines-terminated-by '\n' --connect jdbc:mysql://**********.us-west-1.rds.amazonaws.com/tlog_stable --table tlog_tloguser --username infoscout --password ****** --num-mappers 6 --hive-import --direct --hive-overwrite --target-dir s3n://****:*********@iakbar.emr/dump/
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
14/02/04 19:43:01 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/02/04 19:43:01 WARN tool.BaseSqoopTool: It seems that you've specified at least one of following:
14/02/04 19:43:01 WARN tool.BaseSqoopTool: --hive-home
14/02/04 19:43:01 WARN tool.BaseSqoopTool: --hive-overwrite
14/02/04 19:43:01 WARN tool.BaseSqoopTool: --create-hive-table
14/02/04 19:43:01 WARN tool.BaseSqoopTool: --hive-table
14/02/04 19:43:01 WARN tool.BaseSqoopTool: --hive-partition-key
14/02/04 19:43:01 WARN tool.BaseSqoopTool: --hive-partition-value
14/02/04 19:43:01 WARN tool.BaseSqoopTool: --map-column-hive
14/02/04 19:43:01 WARN tool.BaseSqoopTool: Without specifying parameter --hive-import. Please note that
14/02/04 19:43:01 WARN tool.BaseSqoopTool: those arguments will not be used in this session. Either
14/02/04 19:43:01 WARN tool.BaseSqoopTool: specify --hive-import to apply them correctly or remove them
14/02/04 19:43:01 WARN tool.BaseSqoopTool: from command line to remove this warning.
14/02/04 19:43:01 INFO tool.BaseSqoopTool: Please note that --hive-home, --hive-partition-key,
14/02/04 19:43:01 INFO tool.BaseSqoopTool: hive-partition-value and --map-column-hive options are
14/02/04 19:43:01 INFO tool.BaseSqoopTool: are also valid for HCatalog imports and exports
14/02/04 19:43:01 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/02/04 19:43:01 INFO tool.CodeGenTool: Beginning code generation
14/02/04 19:43:02 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tlog_tloguser` AS t LIMIT 1
14/02/04 19:43:03 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tlog_tloguser` AS t LIMIT 1
14/02/04 19:43:03 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop
14/02/04 19:43:03 INFO orm.CompilationManager: Found hadoop core jar at: /home/hadoop/hadoop-core.jar
Note: /tmp/sqoop-hadoop/compile/53d34ab6c04c6db57eacf8637a4d0972/tlog_tloguser.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/02/04 19:43:13 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/53d34ab6c04c6db57eacf8637a4d0972/tlog_tloguser.jar
14/02/04 19:43:13 INFO manager.DirectMySQLManager: Beginning mysqldump fast path import
14/02/04 19:43:13 INFO mapreduce.ImportJobBase: Beginning import of tlog_tloguser
14/02/04 19:43:16 INFO mapred.JobClient: Default number of map tasks: 6
14/02/04 19:43:16 INFO mapred.JobClient: Default number of reduce tasks: 0
14/02/04 19:43:18 INFO security.ShellBasedUnixGroupsMapping: add hadoop to shell userGroupsCache
14/02/04 19:43:18 INFO mapred.JobClient: Setting group to hadoop
14/02/04 19:43:22 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`user_id`), MAX(`user_id`) FROM tlog_tloguser
14/02/04 19:43:23 INFO mapred.JobClient: Running job: job_201402041915_0002
14/02/04 19:43:24 INFO mapred.JobClient: map 0% reduce 0%
14/02/04 19:44:07 INFO mapred.JobClient: map 16% reduce 0%
14/02/04 19:44:28 INFO mapred.JobClient: map 33% reduce 0%
14/02/04 19:44:40 INFO mapred.JobClient: map 50% reduce 0%
14/02/04 19:44:43 INFO mapred.JobClient: map 66% reduce 0%
14/02/04 19:44:49 INFO mapred.JobClient: map 100% reduce 0%
14/02/04 19:45:18 INFO mapred.JobClient: Job complete: job_201402041915_0002
14/02/04 19:45:18 INFO mapred.JobClient: Counters: 18
14/02/04 19:45:18 INFO mapred.JobClient: Job Counters
14/02/04 19:45:18 INFO mapred.JobClient: SLOTS_MILLIS_MAPS=257635
14/02/04 19:45:18 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
14/02/04 19:45:18 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
14/02/04 19:45:18 INFO mapred.JobClient: Launched map tasks=6
14/02/04 19:45:18 INFO mapred.JobClient: SLOTS_MILLIS_REDUCES=0
14/02/04 19:45:18 INFO mapred.JobClient: File Output Format Counters
14/02/04 19:45:18 INFO mapred.JobClient: Bytes Written=101504699
14/02/04 19:45:18 INFO mapred.JobClient: FileSystemCounters
14/02/04 19:45:18 INFO mapred.JobClient: HDFS_BYTES_READ=702
14/02/04 19:45:18 INFO mapred.JobClient: S3N_BYTES_WRITTEN=101504699
14/02/04 19:45:18 INFO mapred.JobClient: FILE_BYTES_WRITTEN=209908
14/02/04 19:45:18 INFO mapred.JobClient: File Input Format Counters
14/02/04 19:45:18 INFO mapred.JobClient: Bytes Read=0
14/02/04 19:45:18 INFO mapred.JobClient: Map-Reduce Framework
14/02/04 19:45:18 INFO mapred.JobClient: Map input records=6
14/02/04 19:45:18 INFO mapred.JobClient: Physical memory (bytes) snapshot=621346816
14/02/04 19:45:18 INFO mapred.JobClient: Spilled Records=0
14/02/04 19:45:18 INFO mapred.JobClient: CPU time spent (ms)=61000
14/02/04 19:45:18 INFO mapred.JobClient: Total committed heap usage (bytes)=158072832
14/02/04 19:45:18 INFO mapred.JobClient: Virtual memory (bytes) snapshot=3808821248
14/02/04 19:45:18 INFO mapred.JobClient: Map output records=390734
14/02/04 19:45:18 INFO mapred.JobClient: SPLIT_RAW_BYTES=702
14/02/04 19:45:18 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 123.6715 seconds (0 bytes/sec)
14/02/04 19:45:18 INFO mapreduce.ImportJobBase: Retrieved 390734 records.
14/02/04 19:45:18 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tlog_tloguser` AS t LIMIT 1
14/02/04 19:45:18 WARN hive.TableDefWriter: Column reg_date had to be cast to a less precise type in Hive
14/02/04 19:45:18 WARN hive.TableDefWriter: Column last_date had to be cast to a less precise type in Hive
14/02/04 19:45:18 WARN hive.TableDefWriter: Column trips_perday had to be cast to a less precise type in Hive
14/02/04 19:45:18 WARN hive.TableDefWriter: Column dups_perday had to be cast to a less precise type in Hive
14/02/04 19:45:18 WARN hive.TableDefWriter: Column illegible_perday had to be cast to a less precise type in Hive
14/02/04 19:45:18 WARN hive.TableDefWriter: Column percent_transcribed had to be cast to a less precise type in Hive
14/02/04 19:45:18 ERROR tool.ImportTool: Imported Failed: This file system object (hdfs://10.202.163.18:9000) does not support access to the request path 's3n://******:*********@iakbar.emr/dump/_logs' You possibly called FileSystem.get(conf) when you should have called FileSystem.get(uri, conf) to obtain a file system supporting your path.
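Note on the final error: the MapReduce import itself succeeded (390,734 records written to S3), but the follow-up Hive load step resolved paths against the cluster's default filesystem (hdfs://10.202.163.18:9000) and rejected the s3n:// target. A plausible rework, sketched below, is to point --target-dir at an HDFS path for the --hive-import run and copy to S3 as a separate step; the endpoint, bucket, and paths here are placeholders, not values from the session above.

```shell
# Sketch, not the original command: stage the --hive-import on HDFS
# (note -P in place of --password, as the Sqoop warning above suggests).
HADOOP_HOME=/home/hadoop/ sqoop-1.4.4.bin__hadoop-1.0.0/bin/sqoop import \
  --fields-terminated-by '\t' --lines-terminated-by '\n' \
  --connect jdbc:mysql://<rds-endpoint>/tlog_stable \
  --table tlog_tloguser \
  --username infoscout -P \
  --num-mappers 6 --direct \
  --hive-import --hive-overwrite \
  --target-dir /tmp/sqoop/tlog_tloguser

# If a copy is also wanted on S3, transfer it separately:
hadoop distcp /tmp/sqoop/tlog_tloguser s3n://<bucket>/dump/
```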