
A C/C++ program that accesses HDFS through hdfs.h fails at runtime with the following error. What could be the cause? (Note: Hadoop is installed in /data/hadoop/hadoop-2.4.0, and /data/hadoop/current is a symlink pointing to it.)
loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=172.25.40.171, port=9001, kerbTicketCachePath=(NULL), userName=(NULL)) error:
E0507 19:02:57.251287 17859 hdfs_persistence.cpp:31] connect hdfs://172.25.40.171:9001 error: Unknown error 255
The key items in the output above are "NoClassDefFoundError" and "ExceptionUtils": the JVM cannot find ExceptionUtils, which usually means the jar that provides it is missing from the classpath. A Google search for "ExceptionUtils jar" suggests that ExceptionUtils lives in apache-commons-lang.jar.
A further search for "apache-commons-lang.jar" leads to the download page http://commons.apache.org/proper/commons-lang/download_lang.cgi, where commons-lang3-3.3.2-bin.tar.gz is available; unpacking it yields commons-lang3-3.3.2.jar.
The Hadoop binary distribution should already ship this file, and indeed a look around the install directory turns up commons-lang-2.6.jar under share/hadoop/tools/lib — that should be the one.
The command "hadoop classpath" prints Hadoop's classpath:
./hadoop classpath
/data/hadoop/hadoop-2.4.0/etc/hadoop:/data/hadoop/hadoop-2.4.0/share/hadoop/common/lib/*:/data/hadoop/hadoop-2.4.0/share/hadoop/common/*:/data/hadoop/hadoop-2.4.0/share/hadoop/hdfs:/data/hadoop/hadoop-2.4.0/share/hadoop/hdfs/lib/*:/data/hadoop/hadoop-2.4.0/share/hadoop/hdfs/*:/data/hadoop/hadoop-2.4.0/share/hadoop/yarn/lib/*:/data/hadoop/hadoop-2.4.0/share/hadoop/yarn/*:/data/hadoop/hadoop-2.4.0/share/hadoop/mapreduce/lib/*:/data/hadoop/hadoop-2.4.0/share/hadoop/mapreduce/*:/data/hadoop/current/contrib/capacity-scheduler/*.jar
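Splitting that output on ':' makes two things easier to see: there is no share/hadoop/tools entry anywhere, and most entries are '*' wildcards rather than concrete jar paths — and a JVM created through the JNI invocation API (which is how libhdfs starts one) does not expand such wildcards. A small sketch, using a shortened excerpt of the string above:

```shell
# A shortened excerpt of the classpath printed above, split on ':'
# to show one entry per line. Note that two of the three entries
# are '*' wildcards rather than concrete jar paths.
cp='/data/hadoop/hadoop-2.4.0/etc/hadoop:/data/hadoop/hadoop-2.4.0/share/hadoop/common/lib/*:/data/hadoop/hadoop-2.4.0/share/hadoop/common/*'
echo "$cp" | tr ':' '\n'
```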
Unfortunately, the tools directory does not appear anywhere in it. That looked like the problem, so it was added by hand (hadoop is installed in /data/hadoop/current):
export CLASSPATH=`/data/hadoop/current/bin/hadoop classpath`:/data/hadoop/current/share/hadoop/tools/lib
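In hindsight, appending the bare directory .../tools/lib cannot work even in principle: a plain directory on a JVM classpath only supplies .class files, and jars inside it are picked up only when named individually or via a trailing '*'. The difference between the two forms is easy to see at the shell level (throwaway paths, purely for illustration):

```shell
# Create a throwaway directory with two dummy jar files in it.
tmp=$(mktemp -d)
touch "$tmp/a.jar" "$tmp/b.jar"
# A bare directory is a single classpath entry; the jars inside it
# are invisible to the JVM's class loading:
echo "$tmp"
# A '*' glob (expanded here by the shell) enumerates the jars one by one:
echo "$tmp"/*.jar
```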
Re-running the program, it still fails:
E0507 19:52:48.197748 27787 hdfs_persistence.cpp:31] connect hdfs://172.25.40.171:9001 error: Unknown error 255
Fine, let's be more direct and put commons-lang-2.6.jar itself on the classpath:
export CLASSPATH=`/data/hadoop/current/bin/hadoop classpath`:/data/hadoop/current/share/hadoop/tools/lib/commons-lang-2.6.jar
This time the ExceptionUtils error is gone, but a new one appears:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FileSystem
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FileSystem
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
E0507 19:12:56.522274 19834 hdfs_persistence.cpp:31] connect hdfs://172.25.40.171:9001 error: Unknown error 255
Still NoClassDefFoundError, presumably for the same reason: some jar is missing from the classpath. The question is which jars contain FileSystem and Configuration (one way to check is to list a jar's entries, e.g. `unzip -l <jar> | grep FileSystem`).
FileSystem is in hadoop-common-2.4.0.jar, and so is the missing org.apache.hadoop.conf.Configuration; commons-configuration-1.6.jar (a hadoop-common dependency) provides a different, Apache Commons Configuration class. Both jars sit in directories that the classpath above covers via wildcard entries, so why the error?
Not knowing Java well, the next attempt was to add hadoop-common-2.4.0.jar and commons-configuration-1.6.jar to the classpath explicitly:
export CLASSPATH=`/data/hadoop/current/bin/hadoop classpath`:/data/hadoop/current/share/hadoop/tools/lib/commons-lang-2.6.jar:/data/hadoop/current/share/hadoop/common/hadoop-common-2.4.0.jar:/data/hadoop/current/share/hadoop/common/lib/commons-configuration-1.6.jar
With that, the FileSystem and Configuration errors disappear, which confirms the approach works — but a new class is now missing:
java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.hadoop.fs.FileSystem.&lt;clinit&gt;(FileSystem.java:95)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
... 1 more
at org.apache.hadoop.conf.Configuration.&lt;clinit&gt;(Configuration.java:169)
A bit of searching shows that LogFactory is in commons-logging-1.1.3.jar, under share/hadoop/common/lib; add that to the classpath as well:
export CLASSPATH=`/data/hadoop/current/bin/hadoop classpath`:/data/hadoop/current/share/hadoop/tools/lib/commons-lang-2.6.jar:/data/hadoop/current/share/hadoop/common/hadoop-common-2.4.0.jar:/data/hadoop/current/share/hadoop/common/lib/commons-configuration-1.6.jar:/data/hadoop/current/share/hadoop/common/lib/commons-logging-1.1.3.jar
Running again, yet another error:
java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.viewfs.ViewFileSystem could not be instantiated: java.lang.NoClassDefFoundError: com/google/common/collect/Maps
at java.util.ServiceLoader.fail(ServiceLoader.java:232)
at java.util.ServiceLoader.access$100(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:384)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2364)
The same kind of error again — this time Guava's com/google/common/collect/Maps — and chasing missing classes one jar at a time like this would never end. The underlying reason is that the '*' wildcard entries (such as lib/*) in the hadoop classpath output are expanded by the java launcher but not by the JVM that libhdfs creates through JNI, so every jar has to appear on the classpath by name. Not knowing Java well, brute force it is — generate an export statement for each jar file:
find /data/hadoop/current/ -name '*.jar' | awk '{ printf("export CLASSPATH=%s:$CLASSPATH\n", $0); }'
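Note that the one-liner above only prints export statements; to actually apply them you would eval its output. An equivalent and simpler form joins all the jar paths with ':' in one go — sketched here on a throwaway directory of empty jar files standing in for the Hadoop tree, since real paths differ per machine:

```shell
# Stand-in for the Hadoop install tree: a temp dir with dummy jars.
tmp=$(mktemp -d)
mkdir -p "$tmp/share/hadoop/common/lib"
touch "$tmp/share/hadoop/common/hadoop-common-2.4.0.jar" \
      "$tmp/share/hadoop/common/lib/commons-logging-1.1.3.jar"
# Join every jar path with ':'. Note the quoted '*.jar' so the shell
# does not expand the pattern against the current directory.
jars=$(find "$tmp" -name '*.jar' | tr '\n' ':')
CLASSPATH="${jars%:}"   # drop the trailing ':'
echo "$CLASSPATH"
```

In a real setup the stand-in directory would be /data/hadoop/current, and the resulting CLASSPATH would be exported before starting the program. Later Hadoop releases (reportedly 2.6 and up) also offer `hadoop classpath --glob`, which prints the fully expanded jar list directly.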
And finally it works ^_^ — what an ordeal.