
HDFS Error: Unable to Resolve Host

// Upload a local file to HDFS from a Windows client.
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUpload {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Make the client connect to DataNodes by hostname rather than by
        // the (private) IP addresses the NameNode reports.
        conf.set("dfs.client.use.datanode.hostname", "true");
        FileSystem fileSystem = FileSystem.get(new URI("hdfs://xxx:9000/"), conf, "root");

        Path src = new Path("C:\\Users\\Administrator\\Downloads\\idman633.exe");
        Path dst = new Path("hdfs://xxx:9000/name");

        fileSystem.copyFromLocalFile(src, dst);
        fileSystem.close();
    }
}

Running it like this throws the following error:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

java.io.IOException: java.nio.channels.UnresolvedAddressException

at org.apache.hadoop.hdfs.DataStreamer$LastExceptionInStreamer.set(DataStreamer.java:299)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:820)
Caused by: java.nio.channels.UnresolvedAddressException
at sun.nio.ch.Net.checkAddress(Net.java:101)
at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.hdfs.DataStreamer.createSocketForPipeline(DataStreamer.java:259)
at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1692)
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1648)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:704)


Process finished with exit code -1
           

I struggled with this for a long time and asked quite a few people; in the end it was solved over a remote session. It was not a firewall problem, and the Hadoop configuration was not wrong either. The cause: when the cluster runs on a cloud server, you have to add a hostname mapping on the Windows client:

129.xxx.xxx.xxx  bigdata

That fixed it. A real pitfall; configuration is the part I dislike the most.
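On Windows, a mapping like the one above goes into the system hosts file. A minimal sketch, where the IP and the hostname `bigdata` are placeholders for your own cloud server's public IP and the hostname the DataNode registers under:

```
# C:\Windows\System32\drivers\etc\hosts
# Map the DataNode's registered hostname to the server's public IP
129.xxx.xxx.xxx  bigdata
```

After editing the file, the Java client can resolve `bigdata` and the `UnresolvedAddressException` goes away.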

In addition, when talking to a remote host you need to set one extra client parameter, `dfs.client.use.datanode.hostname` (already shown in the code above).
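If you prefer not to set that parameter in code, the same standard HDFS client property can be placed in the client-side hdfs-site.xml instead (a sketch of the equivalent configuration):

```xml
<!-- hdfs-site.xml on the client machine -->
<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
</property>
```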


In short, big data still has plenty of pitfalls. Keep at it!
