  1. remote-task AnswerDistributionWorkflow --n-reduce-tasks 10 --host localhost --user edxtma --remote-name analyticstack --skip-setup --local-scheduler --verbose --wait --src hdfs://localhost:9000/data --dest hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest --name pt_1449177792 --output-root hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/course --include "tracking.log-20160204-1454541421.gz" --manifest hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt --base-input-format "org.edx.hadoop.input.ManifestTextInputFormat" --lib-jar hdfs://localhost:9000/edx-analytics-pipeline/packages/edx-analytics-hadoop-util.jar --marker hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/marker --credentials /edx/etc/edx-analytics-pipeline/output.json
  2. Parsed arguments = Namespace(branch='release', host='localhost', job_flow_id=None, job_flow_name=None, launch_task_arguments=['AnswerDistributionWorkflow', '--n-reduce-tasks', '10', '--local-scheduler', '--src', 'hdfs://localhost:9000/data', '--dest', 'hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest', '--name', 'pt_1449177792', '--output-root', 'hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/course', '--include', 'tracking.log-20160204-1454541421.gz', '--manifest', 'hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt', '--base-input-format', 'org.edx.hadoop.input.ManifestTextInputFormat', '--lib-jar', 'hdfs://localhost:9000/edx-analytics-pipeline/packages/edx-analytics-hadoop-util.jar', '--marker', 'hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/marker', '--credentials', '/edx/etc/edx-analytics-pipeline/output.json'], log_path=None, override_config=None, private_key=None, remote_name='analyticstack', repo=None, secure_config=None, secure_config_branch=None, secure_config_repo=None, shell=None, skip_setup=True, sudo_user='hadoop', user='edxtma', vagrant_path=None, verbose=True, wait=True, wheel_url=None, workflow_profiler=None)
  3. Running commands from path = /home/edxtma/pipeline/share/edx.analytics.tasks
  4. Remote name = analyticstack
  5. Running command = ['ssh', '-tt', '-o', 'ForwardAgent=yes', '-o', 'StrictHostKeyChecking=no', '-o', 'UserKnownHostsFile=/dev/null', '-o', 'KbdInteractiveAuthentication=no', '-o', 'PasswordAuthentication=no', '-o', 'User=edxtma', '-o', 'ConnectTimeout=10', 'localhost', "sudo -Hu hadoop /bin/bash -c 'cd /var/lib/analytics-tasks/analyticstack/repo && . $HOME/.bashrc && /var/lib/analytics-tasks/analyticstack/venv/bin/launch-task AnswerDistributionWorkflow --n-reduce-tasks 10 --local-scheduler --src hdfs://localhost:9000/data --dest hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest --name pt_1449177792 --output-root hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/course --include tracking.log-20160204-1454541421.gz --manifest hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt --base-input-format org.edx.hadoop.input.ManifestTextInputFormat --lib-jar hdfs://localhost:9000/edx-analytics-pipeline/packages/edx-analytics-hadoop-util.jar --marker hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/marker --credentials /edx/etc/edx-analytics-pipeline/output.json'"]
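Before digging into the Hadoop output below, it can be worth confirming that the --include pattern in the invocation above actually matches a file under --src. A possible check, with the paths copied from the command (the recursive listing is just one convenient form):

    hdfs dfs -ls -R hdfs://localhost:9000/data | grep 'tracking.log-20160204-1454541421.gz'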
  6. Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
  7. DEBUG:stevedore.extension:found extension EntryPoint.parse('sqoop-import = edx.analytics.tasks.sqoop:SqoopImportFromMysql')
  8. DEBUG:stevedore.extension:found extension EntryPoint.parse('last-country = edx.analytics.tasks.user_location:LastCountryForEachUser')
  9. DEBUG:stevedore.extension:found extension EntryPoint.parse('enrollment_validation = edx.analytics.tasks.enrollment_validation:CourseEnrollmentValidationTask')
  10. DEBUG:stevedore.extension:found extension EntryPoint.parse('inc-enrollments-report = edx.analytics.tasks.reports.incremental_enrollments:WeeklyIncrementalUsersAndEnrollments')
  11. DEBUG:stevedore.extension:found extension EntryPoint.parse('total-enrollments-report = edx.analytics.tasks.reports.total_enrollments:WeeklyAllUsersAndEnrollments')
  12. DEBUG:stevedore.extension:found extension EntryPoint.parse('load-d-course = edx.analytics.tasks.load_internal_reporting_course:LoadInternalReportingCourseToWarehouse')
  13. DEBUG:stevedore.extension:found extension EntryPoint.parse('database-import = edx.analytics.tasks.database_imports:ImportAllDatabaseTablesTask')
  14. DEBUG:stevedore.extension:found extension EntryPoint.parse('ed_services_report = edx.analytics.tasks.reports.ed_services_financial_report:BuildEdServicesReportTask')
  15. DEBUG:stevedore.extension:found extension EntryPoint.parse('export-student-module = edx.analytics.tasks.database_exports:StudentModulePerCourseAfterImportWorkflow')
  16. DEBUG:stevedore.extension:found extension EntryPoint.parse('calendar = edx.analytics.tasks.calendar_task:CalendarTableTask')
  17. DEBUG:stevedore.extension:found extension EntryPoint.parse('orders = edx.analytics.tasks.reports.orders_import:OrderTableTask')
  18. DEBUG:stevedore.extension:found extension EntryPoint.parse('cybersource = edx.analytics.tasks.reports.cybersource:DailyPullFromCybersourceTask')
  19. DEBUG:stevedore.extension:found extension EntryPoint.parse('load-d-user = edx.analytics.tasks.load_internal_reporting_user:LoadInternalReportingUserToWarehouse')
  20. DEBUG:stevedore.extension:found extension EntryPoint.parse('location-per-course = edx.analytics.tasks.location_per_course:LastCountryOfUser')
  21. DEBUG:stevedore.extension:found extension EntryPoint.parse('payment_reconcile = edx.analytics.tasks.reports.reconcile:ReconcileOrdersAndTransactionsTask')
  22. DEBUG:stevedore.extension:found extension EntryPoint.parse('enrollments-report = edx.analytics.tasks.reports.enrollments:EnrollmentsByWeek')
  23. DEBUG:stevedore.extension:found extension EntryPoint.parse('dump-student-module = edx.analytics.tasks.database_exports:StudentModulePerCourseTask')
  24. DEBUG:stevedore.extension:found extension EntryPoint.parse('noop = edx.analytics.tasks.performance:ParseEventLogPerformanceTask')
  25. DEBUG:stevedore.extension:found extension EntryPoint.parse('user-activity = edx.analytics.tasks.user_activity:CourseActivityWeeklyTask')
  26. DEBUG:stevedore.extension:found extension EntryPoint.parse('paypal = edx.analytics.tasks.reports.paypal:PaypalTransactionsByDayTask')
  27. DEBUG:stevedore.extension:found extension EntryPoint.parse('grade-dist = edx.analytics.tasks.studentmodule_dist:GradeDistFromSqoopToMySQLWorkflow')
  28. DEBUG:stevedore.extension:found extension EntryPoint.parse('enrollments_and_registrations_workflow-manifest = edx.analytics.tasks.reports.enrollments_and_registrations_workflow_manifest:EnrollmentsandRegistrationsWorkflow')
  29. DEBUG:stevedore.extension:found extension EntryPoint.parse('financial_reports = edx.analytics.tasks.reports.finance_reports:BuildFinancialReportsTask')
  30. DEBUG:stevedore.extension:found extension EntryPoint.parse('catalog = edx.analytics.tasks.course_catalog:CourseCatalogWorkflow')
  31. DEBUG:stevedore.extension:found extension EntryPoint.parse('enrollments = edx.analytics.tasks.enrollments:ImportEnrollmentsIntoMysql')
  32. DEBUG:stevedore.extension:found extension EntryPoint.parse('course-enroll = edx.analytics.tasks.course_enroll:CourseEnrollmentChangesPerDay')
  33. DEBUG:stevedore.extension:found extension EntryPoint.parse('export-events = edx.analytics.tasks.event_exports:EventExportTask')
  34. DEBUG:stevedore.extension:found extension EntryPoint.parse('overall_events = edx.analytics.tasks.overall_events:TotalEventsDailyTask')
  35. DEBUG:stevedore.extension:found extension EntryPoint.parse('load-f-user-activity = edx.analytics.tasks.load_internal_reporting_user_activity:LoadInternalReportingUserActivityToWarehouse')
  36. DEBUG:stevedore.extension:found extension EntryPoint.parse('student_engagement = edx.analytics.tasks.student_engagement:StudentEngagementTask')
  37. DEBUG:stevedore.extension:found extension EntryPoint.parse('answer-dist = edx.analytics.tasks.answer_dist:AnswerDistributionPerCourse')
  38. DEBUG:stevedore.extension:found extension EntryPoint.parse('video = edx.analytics.tasks.video:InsertToMysqlAllVideoTask')
  39. DEBUG:stevedore.extension:found extension EntryPoint.parse('insert-into-table = edx.analytics.tasks.mysql_load:MysqlInsertTask')
  40. DEBUG:stevedore.extension:found extension EntryPoint.parse('all_events_report = edx.analytics.tasks.reports.total_events_report:TotalEventsReportWorkflow')
  41. DEBUG:edx.analytics.tasks.launchers.local:Using override.cfg
  42. 2016-02-05 16:22:17,761 INFO 7350 [luigi-interface] worker.py:267 - Scheduled AnswerDistributionWorkflow(overwrite=False, database=reports, credentials=/edx/etc/edx-analytics-pipeline/output.json, name=pt_1449177792, src=('hdfs://localhost:9000/data',), dest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest, include=('tracking.log-20160204-1454541421.gz',), manifest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt, answer_metadata=None, base_input_format=org.edx.hadoop.input.ManifestTextInputFormat, output_root=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/course, marker=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/marker) (PENDING)
  43. 2016-02-05 16:22:17,762 INFO 7350 [luigi-interface] worker.py:267 - Scheduled AnswerDistributionToMySQLTaskWorkflow(database=reports, credentials=/edx/etc/edx-analytics-pipeline/output.json, name=pt_1449177792, src=('hdfs://localhost:9000/data',), dest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest, include=('tracking.log-20160204-1454541421.gz',), manifest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt, answer_metadata=None, base_input_format=org.edx.hadoop.input.ManifestTextInputFormat, overwrite=True) (PENDING)
  44. 2016-02-05 16:22:18,642 INFO 7350 [luigi-interface] worker.py:267 - Scheduled AnswerDistributionPerCourse(name=pt_1449177792, src=('hdfs://localhost:9000/data',), dest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest, include=('tracking.log-20160204-1454541421.gz',), manifest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt, answer_metadata=None, base_input_format=org.edx.hadoop.input.ManifestTextInputFormat) (DONE)
  45. 2016-02-05 16:22:18,643 INFO 7350 [luigi-interface] worker.py:267 - Scheduled ExternalURL(url=/edx/etc/edx-analytics-pipeline/output.json) (DONE)
  46. 2016-02-05 16:22:19,520 INFO 7350 [luigi-interface] worker.py:267 - Scheduled AnswerDistributionOneFilePerCourseTask(output_root=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/course, name=pt_1449177792, src=('hdfs://localhost:9000/data',), dest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest, include=('tracking.log-20160204-1454541421.gz',), manifest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt, answer_metadata=None, base_input_format=org.edx.hadoop.input.ManifestTextInputFormat) (PENDING)
  47. 2016-02-05 16:22:19,521 INFO 7350 [luigi-interface] interface.py:193 - Done scheduling tasks
  48. 2016-02-05 16:22:19,521 INFO 7350 [luigi-interface] worker.py:282 - [pid 7350] Worker Worker(salt=834663771, host=insight-eurotunnel, username=hadoop, pid=7350) running AnswerDistributionToMySQLTaskWorkflow(database=reports, credentials=/edx/etc/edx-analytics-pipeline/output.json, name=pt_1449177792, src=('hdfs://localhost:9000/data',), dest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest, include=('tracking.log-20160204-1454541421.gz',), manifest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt, answer_metadata=None, base_input_format=org.edx.hadoop.input.ManifestTextInputFormat, overwrite=True)
  49. 2016-02-05 16:22:21,596 INFO 7350 [luigi-interface] worker.py:296 - [pid 7350] Worker Worker(salt=834663771, host=insight-eurotunnel, username=hadoop, pid=7350) done AnswerDistributionToMySQLTaskWorkflow(database=reports, credentials=/edx/etc/edx-analytics-pipeline/output.json, name=pt_1449177792, src=('hdfs://localhost:9000/data',), dest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest, include=('tracking.log-20160204-1454541421.gz',), manifest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt, answer_metadata=None, base_input_format=org.edx.hadoop.input.ManifestTextInputFormat, overwrite=True)
  50. 2016-02-05 16:22:21,596 INFO 7350 [luigi-interface] worker.py:282 - [pid 7350] Worker Worker(salt=834663771, host=insight-eurotunnel, username=hadoop, pid=7350) running AnswerDistributionOneFilePerCourseTask(output_root=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/course, name=pt_1449177792, src=('hdfs://localhost:9000/data',), dest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest, include=('tracking.log-20160204-1454541421.gz',), manifest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt, answer_metadata=None, base_input_format=org.edx.hadoop.input.ManifestTextInputFormat)
  51. 2016-02-05 16:22:24,576 INFO 7350 [luigi-interface] hadoop.py:203 - /edx/app/hadoop/hadoop/bin/hadoop jar /edx/app/hadoop/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.3.0.jar -libjars /tmp/tmp1K35QJ/edx-analytics-hadoop-util.jar -D mapred.job.name=AnswerDistributionOneFilePerCourseTask(output_root=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/course, name=pt_1449177792, src=('hdfs://localhost:9000/data',), dest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest, include=('tracking.log-20160204-1454541421.gz',), manifest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt, answer_metadata=None, base_input_format=org.edx.hadoop.input.ManifestTextInputFormat) -D mapred.reduce.tasks=10 -mapper /usr/bin/python2.7 mrrunner.py map -reducer /usr/bin/python2.7 mrrunner.py reduce -file /var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/mrrunner.py -file /tmp/tmp1K35QJ/packages.tar -file /tmp/tmp1K35QJ/job-instance.pickle -input /tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest/answer_distribution_per_course_pt_1449177792 -output /tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/marker/7055871242510585221-temp-2016-02-05T16-22-22.654250
  52. 2016-02-05 16:22:25,150 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:25 INFO Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name
  53. 2016-02-05 16:22:25,150 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:25 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
  54. 2016-02-05 16:22:25,156 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:25 WARN streaming.StreamJob: -file option is deprecated, please use generic option -files instead.
  55. 2016-02-05 16:22:25,695 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:25 INFO client.RMProxy: Connecting to ResourceManager at /127.0.0.1:8032
  56. 2016-02-05 16:22:25,763 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:25 INFO client.RMProxy: Connecting to ResourceManager at /127.0.0.1:8032
  57. 2016-02-05 16:22:26,480 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:26 INFO mapred.FileInputFormat: Total input paths to process : 5
  58. 2016-02-05 16:22:26,563 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:26 INFO mapreduce.JobSubmitter: number of splits:5
  59. 2016-02-05 16:22:26,641 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:26 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1454684580767_0004
  60. 2016-02-05 16:22:26,754 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:26 INFO impl.YarnClientImpl: Submitted application application_1454684580767_0004
  61. 2016-02-05 16:22:26,778 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:26 INFO mapreduce.Job: The url to track the job: http://localhost:8088/proxy/application_1454684580767_0004/
  62. 2016-02-05 16:22:26,779 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:26 INFO mapreduce.Job: Running job: job_1454684580767_0004
  63. 2016-02-05 16:22:30,905 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:30 INFO mapreduce.Job: Job job_1454684580767_0004 running in uber mode : false
  64. 2016-02-05 16:22:30,906 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:30 INFO mapreduce.Job: map 0% reduce 0%
  65. 2016-02-05 16:22:41,002 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:41 INFO mapreduce.Job: map 100% reduce 0%
  66. 2016-02-05 16:22:45,028 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:45 INFO mapreduce.Job: map 100% reduce 10%
  67. 2016-02-05 16:22:51,071 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:51 INFO mapreduce.Job: map 100% reduce 30%
  68. 2016-02-05 16:22:51,074 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:51 INFO mapreduce.Job: Task Id : attempt_1454684580767_0004_r_000001_0, Status : FAILED
  69. 2016-02-05 16:22:51,088 INFO 7350 [luigi-interface] hadoop.py:234 - Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 143
  70. 2016-02-05 16:22:51,095 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
  71. 2016-02-05 16:22:51,096 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
  72. 2016-02-05 16:22:51,096 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)
  73. 2016-02-05 16:22:51,096 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
  74. 2016-02-05 16:22:51,096 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
  75. 2016-02-05 16:22:51,097 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
  76. 2016-02-05 16:22:51,103 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
  77. 2016-02-05 16:22:51,103 INFO 7350 [luigi-interface] hadoop.py:234 - at java.security.AccessController.doPrivileged(Native Method)
  78. 2016-02-05 16:22:51,103 INFO 7350 [luigi-interface] hadoop.py:234 - at javax.security.auth.Subject.doAs(Subject.java:422)
  79. 2016-02-05 16:22:51,103 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
  80. 2016-02-05 16:22:51,103 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
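The Java stack trace above only records the streaming wrapper's exit status (143 for this first attempt, then 1 for the retries further down); the actual Python error from the reduce step ends up in the task attempt's stderr. One way to pull it, assuming YARN log aggregation is enabled on the analyticstack box (the application id comes from the submission lines above; the output path is just an example):

    yarn logs -applicationId application_1454684580767_0004 > /tmp/application_1454684580767_0004.log
    grep -n -B 2 -A 20 'Traceback' /tmp/application_1454684580767_0004.log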
  81. 2016-02-05 16:22:52,107 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:52 INFO mapreduce.Job: map 100% reduce 20%
  82. 2016-02-05 16:22:53,112 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:53 INFO mapreduce.Job: map 100% reduce 30%
  83. 2016-02-05 16:22:54,115 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:54 INFO mapreduce.Job: map 100% reduce 40%
  84. 2016-02-05 16:22:56,128 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:56 INFO mapreduce.Job: map 100% reduce 50%
  85. 2016-02-05 16:22:57,135 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:57 INFO mapreduce.Job: map 100% reduce 60%
  86. 2016-02-05 16:22:59,145 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:22:59 INFO mapreduce.Job: map 100% reduce 70%
  87. 2016-02-05 16:23:00,171 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:23:00 INFO mapreduce.Job: map 100% reduce 80%
  88. 2016-02-05 16:23:02,180 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:23:02 INFO mapreduce.Job: map 100% reduce 90%
  89. 2016-02-05 16:23:05,196 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:23:05 INFO mapreduce.Job: Task Id : attempt_1454684580767_0004_r_000001_1, Status : FAILED
  90. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
  91. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
  92. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
  93. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)
  94. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
  95. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
  96. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
  97. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
  98. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at java.security.AccessController.doPrivileged(Native Method)
  99. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at javax.security.auth.Subject.doAs(Subject.java:422)
  100. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
  101. 2016-02-05 16:23:05,198 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
  102. 2016-02-05 16:23:13,230 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:23:13 INFO mapreduce.Job: Task Id : attempt_1454684580767_0004_r_000001_2, Status : FAILED
  103. 2016-02-05 16:23:13,231 INFO 7350 [luigi-interface] hadoop.py:234 - Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
  104. 2016-02-05 16:23:13,231 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
  105. 2016-02-05 16:23:13,231 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
  106. 2016-02-05 16:23:13,231 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)
  107. 2016-02-05 16:23:13,231 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
  108. 2016-02-05 16:23:13,231 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
  109. 2016-02-05 16:23:13,232 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
  110. 2016-02-05 16:23:13,232 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
  111. 2016-02-05 16:23:13,232 INFO 7350 [luigi-interface] hadoop.py:234 - at java.security.AccessController.doPrivileged(Native Method)
  112. 2016-02-05 16:23:13,232 INFO 7350 [luigi-interface] hadoop.py:234 - at javax.security.auth.Subject.doAs(Subject.java:422)
  113. 2016-02-05 16:23:13,232 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
  114. 2016-02-05 16:23:13,232 INFO 7350 [luigi-interface] hadoop.py:234 - at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
  115. 2016-02-05 16:23:22,267 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:23:22 INFO mapreduce.Job: map 100% reduce 100%
  116. 2016-02-05 16:23:22,274 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:23:22 INFO mapreduce.Job: Job job_1454684580767_0004 failed with state FAILED due to: Task failed task_1454684580767_0004_r_000001
  117. 2016-02-05 16:23:22,274 INFO 7350 [luigi-interface] hadoop.py:234 - Job failed as tasks failed. failedMaps:0 failedReduces:1
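All five maps completed; it is the same reduce partition, r_000001, that fails on every retry (attempts _0, _1 and _2 appear above, and the counters below report 4 failed reduce tasks in total), so its stderr is the place to look. A possible filter over the aggregated log dump from the earlier yarn logs sketch:

    grep -n 'attempt_1454684580767_0004_r_000001' /tmp/application_1454684580767_0004.log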
  118. 2016-02-05 16:23:22,356 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:23:22 INFO mapreduce.Job: Counters: 50
  119. 2016-02-05 16:23:22,356 INFO 7350 [luigi-interface] hadoop.py:234 - File System Counters
  120. 2016-02-05 16:23:22,356 INFO 7350 [luigi-interface] hadoop.py:234 - FILE: Number of bytes read=54
  121. 2016-02-05 16:23:22,356 INFO 7350 [luigi-interface] hadoop.py:234 - FILE: Number of bytes written=1739245
  122. 2016-02-05 16:23:22,357 INFO 7350 [luigi-interface] hadoop.py:234 - FILE: Number of read operations=0
  123. 2016-02-05 16:23:22,357 INFO 7350 [luigi-interface] hadoop.py:234 - FILE: Number of large read operations=0
  124. 2016-02-05 16:23:22,357 INFO 7350 [luigi-interface] hadoop.py:234 - FILE: Number of write operations=0
  125. 2016-02-05 16:23:22,357 INFO 7350 [luigi-interface] hadoop.py:234 - HDFS: Number of bytes read=460431
  126. 2016-02-05 16:23:22,357 INFO 7350 [luigi-interface] hadoop.py:234 - HDFS: Number of bytes written=0
  127. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - HDFS: Number of read operations=42
  128. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - HDFS: Number of large read operations=0
  129. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - HDFS: Number of write operations=18
  130. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - Job Counters
  131. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - Failed reduce tasks=4
  132. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - Launched map tasks=5
  133. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - Launched reduce tasks=13
  134. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - Data-local map tasks=5
  135. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - Total time spent by all maps in occupied slots (ms)=37568
  136. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - Total time spent by all reduces in occupied slots (ms)=87148
  137. 2016-02-05 16:23:22,358 INFO 7350 [luigi-interface] hadoop.py:234 - Total time spent by all map tasks (ms)=37568
  138. 2016-02-05 16:23:22,359 INFO 7350 [luigi-interface] hadoop.py:234 - Total time spent by all reduce tasks (ms)=87148
  139. 2016-02-05 16:23:22,359 INFO 7350 [luigi-interface] hadoop.py:234 - Total vcore-seconds taken by all map tasks=37568
  140. 2016-02-05 16:23:22,359 INFO 7350 [luigi-interface] hadoop.py:234 - Total vcore-seconds taken by all reduce tasks=87148
  141. 2016-02-05 16:23:22,359 INFO 7350 [luigi-interface] hadoop.py:234 - Total megabyte-seconds taken by all map tasks=38469632
  142. 2016-02-05 16:23:22,359 INFO 7350 [luigi-interface] hadoop.py:234 - Total megabyte-seconds taken by all reduce tasks=89239552
  143. 2016-02-05 16:23:22,359 INFO 7350 [luigi-interface] hadoop.py:234 - Map-Reduce Framework
  144. 2016-02-05 16:23:22,359 INFO 7350 [luigi-interface] hadoop.py:234 - Map input records=925
  145. 2016-02-05 16:23:22,359 INFO 7350 [luigi-interface] hadoop.py:234 - Map output records=925
  146. 2016-02-05 16:23:22,359 INFO 7350 [luigi-interface] hadoop.py:234 - Map output bytes=467606
  147. 2016-02-05 16:23:22,359 INFO 7350 [luigi-interface] hadoop.py:234 - Map output materialized bytes=471606
  148. 2016-02-05 16:23:22,360 INFO 7350 [luigi-interface] hadoop.py:234 - Input split bytes=1005
  149. 2016-02-05 16:23:22,360 INFO 7350 [luigi-interface] hadoop.py:234 - Combine input records=0
  150. 2016-02-05 16:23:22,360 INFO 7350 [luigi-interface] hadoop.py:234 - Combine output records=0
  151. 2016-02-05 16:23:22,360 INFO 7350 [luigi-interface] hadoop.py:234 - Reduce input groups=0
  152. 2016-02-05 16:23:22,360 INFO 7350 [luigi-interface] hadoop.py:234 - Reduce shuffle bytes=270
  153. 2016-02-05 16:23:22,360 INFO 7350 [luigi-interface] hadoop.py:234 - Reduce input records=0
  154. 2016-02-05 16:23:22,360 INFO 7350 [luigi-interface] hadoop.py:234 - Reduce output records=0
  155. 2016-02-05 16:23:22,360 INFO 7350 [luigi-interface] hadoop.py:234 - Spilled Records=925
  156. 2016-02-05 16:23:22,360 INFO 7350 [luigi-interface] hadoop.py:234 - Shuffled Maps =45
  157. 2016-02-05 16:23:22,360 INFO 7350 [luigi-interface] hadoop.py:234 - Failed Shuffles=0
  158. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - Merged Map outputs=45
  159. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - GC time elapsed (ms)=2716
  160. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - CPU time spent (ms)=6240
  161. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - Physical memory (bytes) snapshot=2500464640
  162. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - Virtual memory (bytes) snapshot=31271399424
  163. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - Total committed heap usage (bytes)=1863319552
  164. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - Shuffle Errors
  165. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - BAD_ID=0
  166. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - CONNECTION=0
  167. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - IO_ERROR=0
  168. 2016-02-05 16:23:22,361 INFO 7350 [luigi-interface] hadoop.py:234 - WRONG_LENGTH=0
  169. 2016-02-05 16:23:22,362 INFO 7350 [luigi-interface] hadoop.py:234 - WRONG_MAP=0
  170. 2016-02-05 16:23:22,362 INFO 7350 [luigi-interface] hadoop.py:234 - WRONG_REDUCE=0
  171. 2016-02-05 16:23:22,362 INFO 7350 [luigi-interface] hadoop.py:234 - File Input Format Counters
  172. 2016-02-05 16:23:22,362 INFO 7350 [luigi-interface] hadoop.py:234 - Bytes Read=459426
  173. 2016-02-05 16:23:22,362 INFO 7350 [luigi-interface] hadoop.py:234 - File Output Format Counters
  174. 2016-02-05 16:23:22,362 INFO 7350 [luigi-interface] hadoop.py:234 - Bytes Written=0
  175. 2016-02-05 16:23:22,362 INFO 7350 [luigi-interface] hadoop.py:234 - 16/02/05 16:23:22 ERROR streaming.StreamJob: Job not Successful!
  176. 2016-02-05 16:23:22,362 INFO 7350 [luigi-interface] hadoop.py:234 - Streaming Command Failed!
  177. 2016-02-05 16:23:22,376 ERROR 7350 [luigi-interface] worker.py:304 - [pid 7350] Worker Worker(salt=834663771, host=insight-eurotunnel, username=hadoop, pid=7350) failed AnswerDistributionOneFilePerCourseTask(output_root=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/course, name=pt_1449177792, src=('hdfs://localhost:9000/data',), dest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/dest, include=('tracking.log-20160204-1454541421.gz',), manifest=hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/manifest.txt, answer_metadata=None, base_input_format=org.edx.hadoop.input.ManifestTextInputFormat)
  178. Traceback (most recent call last):
  179. File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/worker.py", line 292, in _run_task
  180. task.run()
  181. File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/hadoop.py", line 573, in run
  182. self.job_runner().run_job(self)
  183. File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/hadoop.py", line 443, in run_job
  184. run_and_track_hadoop_job(arglist)
  185. File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/hadoop.py", line 279, in run_and_track_hadoop_job
  186. return track_process(arglist, tracking_url_callback, env)
  187. File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/hadoop.py", line 263, in track_process
  188. raise HadoopJobError(message + 'Also, no tracking url found.', out, err)
  189. HadoopJobError: ('Streaming job failed with exit code 1. Also, no tracking url found.', 'packageJobJar: [/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/mrrunner.py, /tmp/tmp1K35QJ/packages.tar, /tmp/tmp1K35QJ/job-instance.pickle, /tmp/hadoop-hadoop/hadoop-unjar2373897905710696655/] [] /tmp/streamjob4180815096956931131.jar tmpDir=null\n', '16/02/05 16:22:25 INFO Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name\n16/02/05 16:22:25 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces\n16/02/05 16:22:25 WARN streaming.StreamJob: -file option is deprecated, please use generic option -files instead.\n16/02/05 16:22:25 INFO client.RMProxy: Connecting to ResourceManager at /127.0.0.1:8032\n16/02/05 16:22:25 INFO client.RMProxy: Connecting to ResourceManager at /127.0.0.1:8032\n16/02/05 16:22:26 INFO mapred.FileInputFormat: Total input paths to process : 5\n16/02/05 16:22:26 INFO mapreduce.JobSubmitter: number of splits:5\n16/02/05 16:22:26 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1454684580767_0004\n16/02/05 16:22:26 INFO impl.YarnClientImpl: Submitted application application_1454684580767_0004\n16/02/05 16:22:26 INFO mapreduce.Job: The url to track the job: http://localhost:8088/proxy/application_1454684580767_0004/\n16/02/05 16:22:26 INFO mapreduce.Job: Running job: job_1454684580767_0004\n16/02/05 16:22:30 INFO mapreduce.Job: Job job_1454684580767_0004 running in uber mode : false\n16/02/05 16:22:30 INFO mapreduce.Job: map 0% reduce 0%\n16/02/05 16:22:41 INFO mapreduce.Job: map 100% reduce 0%\n16/02/05 16:22:45 INFO mapreduce.Job: map 100% reduce 10%\n16/02/05 16:22:51 INFO mapreduce.Job: map 100% reduce 30%\n16/02/05 16:22:51 INFO mapreduce.Job: Task Id : attempt_1454684580767_0004_r_000001_0, Status : FAILED\nError: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 143\n\tat org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)\n\tat org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)\n\tat org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)\n\tat org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)\n\tat org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)\n\tat org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)\n\tat org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)\n\tat java.security.AccessController.doPrivileged(Native Method)\n\tat javax.security.auth.Subject.doAs(Subject.java:422)\n\tat org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)\n\tat org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)\n\n16/02/05 16:22:52 INFO mapreduce.Job: map 100% reduce 20%\n16/02/05 16:22:53 INFO mapreduce.Job: map 100% reduce 30%\n16/02/05 16:22:54 INFO mapreduce.Job: map 100% reduce 40%\n16/02/05 16:22:56 INFO mapreduce.Job: map 100% reduce 50%\n16/02/05 16:22:57 INFO mapreduce.Job: map 100% reduce 60%\n16/02/05 16:22:59 INFO mapreduce.Job: map 100% reduce 70%\n16/02/05 16:23:00 INFO mapreduce.Job: map 100% reduce 80%\n16/02/05 16:23:02 INFO mapreduce.Job: map 100% reduce 90%\n16/02/05 16:23:05 INFO mapreduce.Job: Task Id : attempt_1454684580767_0004_r_000001_1, Status : FAILED\nError: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1\n\tat org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)\n\tat 
org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)\n\tat org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)\n\tat org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)\n\tat org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)\n\tat org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)\n\tat org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)\n\tat java.security.AccessController.doPrivileged(Native Method)\n\tat javax.security.auth.Subject.doAs(Subject.java:422)\n\tat org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)\n\tat org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)\n\n16/02/05 16:23:13 INFO mapreduce.Job: Task Id : attempt_1454684580767_0004_r_000001_2, Status : FAILED\nError: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1\n\tat org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)\n\tat org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)\n\tat org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)\n\tat org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)\n\tat org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)\n\tat org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)\n\tat org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)\n\tat java.security.AccessController.doPrivileged(Native Method)\n\tat javax.security.auth.Subject.doAs(Subject.java:422)\n\tat org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)\n\tat org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)\n\n16/02/05 16:23:22 INFO mapreduce.Job: map 100% reduce 100%\n16/02/05 16:23:22 INFO mapreduce.Job: Job job_1454684580767_0004 failed with state FAILED due to: Task failed task_1454684580767_0004_r_000001\nJob failed as tasks failed. 
failedMaps:0 failedReduces:1\n\n16/02/05 16:23:22 INFO mapreduce.Job: Counters: 50\n\tFile System Counters\n\t\tFILE: Number of bytes read=54\n\t\tFILE: Number of bytes written=1739245\n\t\tFILE: Number of read operations=0\n\t\tFILE: Number of large read operations=0\n\t\tFILE: Number of write operations=0\n\t\tHDFS: Number of bytes read=460431\n\t\tHDFS: Number of bytes written=0\n\t\tHDFS: Number of read operations=42\n\t\tHDFS: Number of large read operations=0\n\t\tHDFS: Number of write operations=18\n\tJob Counters \n\t\tFailed reduce tasks=4\n\t\tLaunched map tasks=5\n\t\tLaunched reduce tasks=13\n\t\tData-local map tasks=5\n\t\tTotal time spent by all maps in occupied slots (ms)=37568\n\t\tTotal time spent by all reduces in occupied slots (ms)=87148\n\t\tTotal time spent by all map tasks (ms)=37568\n\t\tTotal time spent by all reduce tasks (ms)=87148\n\t\tTotal vcore-seconds taken by all map tasks=37568\n\t\tTotal vcore-seconds taken by all reduce tasks=87148\n\t\tTotal megabyte-seconds taken by all map tasks=38469632\n\t\tTotal megabyte-seconds taken by all reduce tasks=89239552\n\tMap-Reduce Framework\n\t\tMap input records=925\n\t\tMap output records=925\n\t\tMap output bytes=467606\n\t\tMap output materialized bytes=471606\n\t\tInput split bytes=1005\n\t\tCombine input records=0\n\t\tCombine output records=0\n\t\tReduce input groups=0\n\t\tReduce shuffle bytes=270\n\t\tReduce input records=0\n\t\tReduce output records=0\n\t\tSpilled Records=925\n\t\tShuffled Maps =45\n\t\tFailed Shuffles=0\n\t\tMerged Map outputs=45\n\t\tGC time elapsed (ms)=2716\n\t\tCPU time spent (ms)=6240\n\t\tPhysical memory (bytes) snapshot=2500464640\n\t\tVirtual memory (bytes) snapshot=31271399424\n\t\tTotal committed heap usage (bytes)=1863319552\n\tShuffle Errors\n\t\tBAD_ID=0\n\t\tCONNECTION=0\n\t\tIO_ERROR=0\n\t\tWRONG_LENGTH=0\n\t\tWRONG_MAP=0\n\t\tWRONG_REDUCE=0\n\tFile Input Format Counters \n\t\tBytes Read=459426\n\tFile Output Format Counters \n\t\tBytes Written=0\n16/02/05 16:23:22 ERROR streaming.StreamJob: Job not Successful!\nStreaming Command Failed!\n')
  190. 2016-02-05 16:23:22,377 INFO 7350 [luigi-interface] notifications.py:96 - Skipping error email. Set `error-email` in the `core` section of the luigi config file to receive error emails.
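That notification line points at luigi's error-email setting; a minimal illustration of the stanza it is asking for (the address is a placeholder, and where the config lives depends on how launch-task is wired up here, e.g. the override.cfg picked up near the top of this log):

    [core]
    error-email = analytics-ops@example.com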
  191. 2016-02-05 16:23:24,140 INFO 7350 [luigi-interface] worker.py:337 - Done
  192. 2016-02-05 16:23:24,140 INFO 7350 [luigi-interface] worker.py:338 - There are no more tasks to run at this time
  193. 2016-02-05 16:23:24,140 INFO 7350 [luigi-interface] worker.py:343 - There are 1 pending tasks possibly being run by other workers
  194. 2016-02-05 16:23:24,159 INFO 7350 [luigi-interface] worker.py:117 - Worker Worker(salt=834663771, host=insight-eurotunnel, username=hadoop, pid=7350) was stopped. Shutting down Keep-Alive thread
  195. Connection to localhost closed.
  196. Exiting with status = 0
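Note that the wrapper still exits with status 0 even though AnswerDistributionOneFilePerCourseTask failed, so the exit code alone is not a reliable success signal for this run. The MySQL load step reported done earlier, but the per-course output and its marker were presumably never written because the streaming job died first. A possible way to confirm, and to clear the partial output before a re-run (paths copied from the invocation above; only remove them if you do want the task to regenerate from scratch):

    hdfs dfs -ls hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/marker
    hdfs dfs -ls hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/course
    hdfs dfs -rm -r hdfs://localhost:9000/tmp/pipeline-task-scheduler/AnswerDistributionWorkflow/1449177792/course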