<configuration>
  <property>
    <name>fs.default.name</name>
    <value>s3://<MY BUCKET NAME>.s3.amazonaws.com:80/</value>
  </property>
  <property>
    <name>fs.s3.awsAccessKeyId</name>
    <value><ACCESS KEY ID></value>
  </property>
  <property>
    <name>fs.s3.awsSecretAccessKey</name>
    <value><SECRET KEY></value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/var/lib/hadoop-0.20/cache/${user.name}</value>
  </property>
</configuration>
Starting Hadoop secondarynamenode daemon (hadoop-secondarynamenode): starting secondarynamenode, logging to /usr/lib/hadoop-0.20/bin/../logs/hadoop-hadoop-secondarynamenode-ip-10-20-60-231.out
Exception in thread "main" java.lang.IllegalArgumentException: Invalid URI for NameNode address (check fs.default.name): s3://<MY BUCKET NAME>.s3.amazonaws.com:80/ is not of scheme 'hdfs'.
        at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:177)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:131)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:115)
        at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:476)
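The exception above explains the failure: the HDFS daemons (NameNode, SecondaryNameNode) require fs.default.name to use the hdfs:// scheme, so they cannot start when the default filesystem points at S3. Also note that Hadoop's s3 scheme expects a bare bucket name (s3://bucket/), not the S3 REST endpoint host and port. One sketch of a fix, assuming you want a running HDFS and only explicit S3 access: keep fs.default.name on HDFS and leave the S3 credentials configured alongside it. The namenode-host value here is a placeholder for your actual NameNode address.

```xml
<!-- core-site.xml sketch: keep HDFS as the default filesystem so the
     NameNode daemons can start; reach S3 explicitly via s3://bucket/ URIs.
     "namenode-host" is a placeholder; credentials as in the original. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:8020/</value>
  </property>
  <property>
    <name>fs.s3.awsAccessKeyId</name>
    <value><ACCESS KEY ID></value>
  </property>
  <property>
    <name>fs.s3.awsSecretAccessKey</name>
    <value><SECRET KEY></value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/var/lib/hadoop-0.20/cache/${user.name}</value>
  </property>
</configuration>
```

Alternatively, if S3 really is meant to be the default filesystem (a supported setup for MapReduce-only clusters), the value should take the form s3://<MY BUCKET NAME>/ and the HDFS daemons should simply not be started, since there is no local HDFS for them to serve.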