Log Type: syslog
Log Upload Time: Sun Nov 25 08:43:13 +0000 2018
Log Length: 9306
2018-11-25 08:29:13,707 INFO [main] org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2018-11-25 08:29:13,771 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2018-11-25 08:29:13,771 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started
2018-11-25 08:29:13,778 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
2018-11-25 08:29:13,779 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1524597382992_21544, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@31f9b85e)
2018-11-25 08:29:13,808 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: YARN_AM_RM_TOKEN, Service: 10.19.65.12:8030,10.19.65.13:8030, Ident: (org.apache.hadoop.yarn.security.AMRMTokenIdentifier@2cdd0d4b)
2018-11-25 08:29:13,848 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: RM_DELEGATION_TOKEN, Service: 10.19.65.12:8032,10.19.65.13:8032, Ident: (RM_DELEGATION_TOKEN owner=dnet.beta, renewer=oozie mr token, realUser=oozie, issueDate=1543133170537, maxDate=1543737970537, sequenceNumber=288386, masterKeyId=1084)
2018-11-25 08:29:13,881 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
2018-11-25 08:29:14,056 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /data/1/yarn/nm/usercache/dnet.beta/appcache/application_1524597382992_21544,/data/2/yarn/nm/usercache/dnet.beta/appcache/application_1524597382992_21544,/data/3/yarn/nm/usercache/dnet.beta/appcache/application_1524597382992_21544,/data/4/yarn/nm/usercache/dnet.beta/appcache/application_1524597382992_21544
2018-11-25 08:29:14,282 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2018-11-25 08:29:14,779 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output Committer Algorithm version is 1
2018-11-25 08:29:14,789 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : [ ]
2018-11-25 08:29:14,983 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: Number of splits :1
Total Length = 134217728
Input split[0]:
   Length = 134217728
   Locations:
-----------------------
2018-11-25 08:29:14,999 INFO [main] org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader: Current split being processed hdfs://nameservice1/tmp/temp2000951168/tmp1857898809/part-r-00001:0+134217728
2018-11-25 08:29:15,107 INFO [main] org.apache.hadoop.mapred.MapTask: (EQUATOR) 0 kvi 67108860(268435440)
2018-11-25 08:29:15,107 INFO [main] org.apache.hadoop.mapred.MapTask: mapreduce.task.io.sort.mb: 256
2018-11-25 08:29:15,107 INFO [main] org.apache.hadoop.mapred.MapTask: soft limit at 214748368
2018-11-25 08:29:15,107 INFO [main] org.apache.hadoop.mapred.MapTask: bufstart = 0; bufvoid = 268435456
2018-11-25 08:29:15,107 INFO [main] org.apache.hadoop.mapred.MapTask: kvstart = 67108860; length = 16777216
2018-11-25 08:29:15,115 INFO [main] org.apache.hadoop.mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
2018-11-25 08:29:15,155 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS
2018-11-25 08:29:15,184 INFO [main] org.apache.hadoop.mapreduce.lib.input.FileInputFormat: Total input paths to process : 1
2018-11-25 08:29:15,184 INFO [main] org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil: Total input paths to process : 1
2018-11-25 08:29:15,197 INFO [main] org.apache.hadoop.io.compress.zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
2018-11-25 08:29:15,219 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.deflate]
2018-11-25 08:29:15,296 INFO [main] org.apache.pig.data.SchemaTupleBackend: Key [pig.schematuple] was not set... will not generate code.
2018-11-25 08:29:15,342 INFO [main] org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Map: Aliases being processed per job phase (AliasName[line,offset]): M: wc_ranked[128,12] C: R:
2018-11-25 08:30:19,802 INFO [Service Thread] org.apache.pig.impl.util.SpillableMemoryManager: first memory handler call- Usage threshold init = 1402470400(1369600K) used = 6087127680(5944460K) committed = 7158628352(6990848K) max = 7158628352(6990848K)
2018-11-25 08:31:14,239 INFO [Service Thread] org.apache.pig.impl.util.SpillableMemoryManager: Spilled an estimate of 17575035332 bytes from 1 objects. init = 1402470400(1369600K) used = 6087127680(5944460K) committed = 7158628352(6990848K) max = 7158628352(6990848K)
2018-11-25 08:31:44,958 INFO [main] org.apache.hadoop.mapred.MapTask: (EQUATOR) 0 kvi 67108860(268435440)
2018-11-25 08:31:44,958 INFO [main] org.apache.hadoop.mapred.MapTask: Record too large for in-memory buffer: 268435471 bytes
2018-11-25 08:31:44,966 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new compressor [.snappy]
2018-11-25 08:32:36,898 INFO [Service Thread] org.apache.pig.impl.util.SpillableMemoryManager: Spilled an estimate of 7768757572 bytes from 1 objects. init = 1402470400(1369600K) used = 6746811088(6588682K) committed = 7158628352(6990848K) max = 7158628352(6990848K)
2018-11-25 08:32:41,378 INFO [main] org.apache.hadoop.mapred.MapTask: Starting flush of map output
2018-11-25 08:32:41,380 INFO [main] org.apache.hadoop.io.compress.CodecPool: Got brand-new compressor [.snappy]
2018-11-25 08:32:41,424 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError
    at java.io.ByteArrayOutputStream.hugeCapacity(ByteArrayOutputStream.java:123)
    at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:117)
    at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:153)
    at java.io.DataOutputStream.write(DataOutputStream.java:107)
    at java.io.DataOutputStream.writeUTF(DataOutputStream.java:401)
    at java.io.DataOutputStream.writeUTF(DataOutputStream.java:323)
    at org.apache.pig.data.utils.SedesHelper.writeChararray(SedesHelper.java:66)
    at org.apache.pig.data.BinInterSedes.writeDatum(BinInterSedes.java:580)
    at org.apache.pig.data.BinInterSedes.writeDatum(BinInterSedes.java:462)
    at org.apache.pig.data.utils.SedesHelper.writeGenericTuple(SedesHelper.java:135)
    at org.apache.pig.data.BinInterSedes.writeTuple(BinInterSedes.java:650)
    at org.apache.pig.data.BinInterSedes.writeBag(BinInterSedes.java:641)
    at org.apache.pig.data.BinInterSedes.writeDatum(BinInterSedes.java:474)
    at org.apache.pig.data.BinInterSedes.writeDatum(BinInterSedes.java:462)
    at org.apache.pig.data.utils.SedesHelper.writeGenericTuple(SedesHelper.java:135)
    at org.apache.pig.data.BinInterSedes.writeTuple(BinInterSedes.java:650)
    at org.apache.pig.data.BinInterSedes.writeDatum(BinInterSedes.java:470)
    at org.apache.pig.data.BinSedesTuple.write(BinSedesTuple.java:40)
    at org.apache.pig.impl.io.PigNullableWritable.write(PigNullableWritable.java:126)
    at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:98)
    at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:82)
    at org.apache.hadoop.mapred.IFile$Writer.append(IFile.java:206)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.spillSingleRecord(MapTask.java:1700)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1184)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:715)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Map.collect(PigGenericMapReduce.java:122)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.runPipeline(PigGenericMapBase.java:284)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:277)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapBase.map(PigGenericMapBase.java:64)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2018-11-25 08:32:41,527 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping MapTask metrics system...
2018-11-25 08:32:41,528 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system stopped.
2018-11-25 08:32:41,528 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system shutdown complete.
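Note: the FATAL error above lines up with the buffer figures reported earlier in this same log. mapreduce.task.io.sort.mb is 256, which gives the 268435456-byte collect buffer (bufvoid), and the record being spilled was logged as 268435471 bytes, i.e. larger than the entire buffer, so the task fell back to spillSingleRecord and then exhausted memory serializing the record. A minimal sketch of that arithmetic, using only values taken from the log:

```python
# Values copied from the log above; this only checks the arithmetic,
# it is not Hadoop code.
io_sort_mb = 256                    # "mapreduce.task.io.sort.mb: 256"
bufvoid = io_sort_mb * 1024 * 1024  # collect-buffer size in bytes
record_size = 268435471             # "Record too large for in-memory buffer"

print(bufvoid)                 # matches "bufvoid = 268435456" in the log
print(record_size - bufvoid)   # the record overflows the whole buffer by 15 bytes
```

This suggests the job died on a single oversized record (likely a huge bag built by the wc_ranked alias), not on aggregate input size, so raising io.sort.mb alone would not necessarily help.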