I. Environment
1) Operating system and core software versions:
OS     | Ubuntu 12.04.3 LTS
JDK    | jdk-7u40-linux-x64
Hadoop | hadoop-1.2.1
II. Installing fuse-dfs
Perform all of the following steps as the root user.
1. Install the build dependencies
apt-get install autoconf automake libtool make gawk g++
2. Install Ant to /usr/ant
Ant download URL: http://mirror.esocc.com/apache/ant/binaries/apache-ant-1.9.3-bin.tar.gz
tar -zxf apache-ant-1.9.3-bin.tar.gz
cp -r apache-ant-1.9.3 /usr/ant
3. Remove any existing FUSE packages, then build and install FUSE 2.9.3 from source (the fuse-2.9.3.tar.gz tarball is assumed to be in the current directory)
apt-get purge fuse
apt-get purge libfuse2
tar -zxf fuse-2.9.3.tar.gz
cd fuse-2.9.3
./configure --prefix=/usr/fuse
make
make install
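Before relying on the new FUSE install, it is worth confirming that the kernel actually has FUSE support registered. A minimal check (nothing here is specific to this setup; it only reads /proc/filesystems):

```shell
# Check that the kernel has the fuse filesystem type registered;
# if it is absent, `modprobe fuse` (as root) should load the module.
if grep -qw fuse /proc/filesystems; then
    fuse_status="fuse filesystem supported"
else
    fuse_status="fuse filesystem not registered"
fi
echo "$fuse_status"
```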
4. Set environment variables
ln -s /usr/fuse/bin/fusermount /usr/bin/
ln -s /usr/ant/bin/ant /usr/local/bin/
vi /etc/profile and append the following (JAVA_HOME and HADOOP_HOME should already be exported here):
export ANT_HOME=/usr/ant
export FUSE_HOME=/usr/fuse
export OS_ARCH=amd64
export OS_BIT=64
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:${HADOOP_HOME}/build/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib:$FUSE_HOME/lib
source /etc/profile
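After sourcing /etc/profile, a quick sanity check that the symlinked tools now resolve on $PATH can save debugging later. A small sketch that only reports and never fails:

```shell
# Report where ant and fusermount resolve from, if at all.
report=""
for tool in ant fusermount; do
    if command -v "$tool" >/dev/null 2>&1; then
        report="$report$tool -> $(command -v "$tool")\n"
    else
        report="$report$tool not found on PATH\n"
    fi
done
printf "%b" "$report"
```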
5. Build libhdfs, the C interface between fuse-dfs and HDFS
cd $HADOOP_HOME/
ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1
ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs
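After the ant build and the symlink, it helps to confirm that the native library actually landed where the link points. A hedged check using the paths from the steps above (it only reports, so it is safe to run either way):

```shell
# Look for libhdfs under the build tree; /usr/hadoop is the HADOOP_HOME
# used elsewhere in this guide and is only a fallback default here.
libdir="${HADOOP_HOME:-/usr/hadoop}/build/libhdfs"
if [ -e "$libdir/libhdfs.so" ] || [ -e "$libdir/libhdfs.so.0" ]; then
    lib_status="libhdfs present in $libdir"
else
    lib_status="libhdfs missing from $libdir"
fi
echo "$lib_status"
```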
6. Build fuse-dfs
ln -s /usr/fuse/include/* /usr/include/
ln -s /usr/fuse/lib/libfuse.so /usr/lib/
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
If linking fails with an `undefined reference to 'fuse_get_context'` error, copy the failing gcc command and move the -L and -l flags to the end, as below (see http://wiki.apache.org/hadoop/BuildFuseDfs023):
cd src/contrib/fuse-dfs/src
gcc -Wall -O3 -o fuse_dfs fuse_dfs.o fuse_options.o fuse_trash.o fuse_stat_struct.o fuse_users.o fuse_init.o fuse_connect.o fuse_impls_access.o fuse_impls_chmod.o fuse_impls_chown.o fuse_impls_create.o fuse_impls_flush.o fuse_impls_getattr.o fuse_impls_mkdir.o fuse_impls_mknod.o fuse_impls_open.o fuse_impls_read.o fuse_impls_release.o fuse_impls_readdir.o fuse_impls_rename.o fuse_impls_rmdir.o fuse_impls_statfs.o fuse_impls_symlink.o fuse_impls_truncate.o fuse_impls_utimens.o fuse_impls_unlink.o fuse_impls_write.o -L/usr/hadoop/build/libhdfs -lhdfs -L/lib -lfuse -L/usr/java/jre/lib/amd64/server -ljvm
Then re-run the ant command above to finish the build.
7. Mount HDFS locally
Edit fuse_dfs_wrapper.sh: add the environment settings below at the top of the file, and change its last line as shown.
vi $HADOOP_HOME/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh and add at the top of the file:
export JAVA_HOME=/usr/java
export HADOOP_HOME=/usr/hadoop
export ANT_HOME=/usr/ant
export FUSE_HOME=/usr/fuse
export PATH=$PATH:$HADOOP_HOME/contrib/fuse_dfs
for f in $HADOOP_HOME/lib/*.jar $HADOOP_HOME/*.jar
do
export CLASSPATH=$CLASSPATH:$f
done
export OS_ARCH=amd64
export OS_BIT=64
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:${HADOOP_HOME}/build/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib:$FUSE_HOME/lib
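The CLASSPATH loop above can be exercised on its own. A self-contained sketch that runs the same glob loop against a throwaway directory standing in for $HADOOP_HOME (the jar names are made up for illustration):

```shell
# Build a CLASSPATH from every jar in a directory tree, the same way
# the wrapper script does, but against a temporary directory.
demo_home=$(mktemp -d)
mkdir -p "$demo_home/lib"
touch "$demo_home/hadoop-core-1.2.1.jar" "$demo_home/lib/commons-logging.jar"

CLASSPATH=""
for f in "$demo_home"/lib/*.jar "$demo_home"/*.jar; do
    CLASSPATH="$CLASSPATH:$f"
done
CLASSPATH=${CLASSPATH#:}    # drop the leading colon
echo "$CLASSPATH"

rm -rf "$demo_home"
```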
Change the last line to:
fuse_dfs "$@"
Create the symlinks:
ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin
ln -s ${HADOOP_HOME}/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/
Mount HDFS to a local directory:
mkdir -p /manager/hdfs
fuse_dfs_wrapper.sh dfs://mt-hadoop-name-vip:9000 /manager/hdfs
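After mounting, `mountpoint` can confirm that the directory is actually backed by a filesystem, and `fusermount -u` unmounts it when done. A sketch using the /manager/hdfs path from above (on a machine without the mount it simply reports "not mounted"):

```shell
# Check whether the HDFS mount point is active; unmount with
# `fusermount -u "$MNT"` when finished.
MNT=/manager/hdfs
if mountpoint -q "$MNT" 2>/dev/null; then
    status="mounted"
else
    status="not mounted"
fi
echo "$MNT is $status"
```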