fuse-dfs Mounting: The Complete Process


 
The fuse-dfs mount finally succeeded. It took me more than two weeks on and off, and the very last step, the mount itself, cost me a whole week. Only after joining an HDFS QQ group and asking around did I learn what I had gotten wrong. Here is the whole process, step by step.
 
 
Prerequisites:
 
CentOS 6.3, Hadoop 1.2.0, JDK 1.6.0_45, fuse 2.8.4, ant 1.9.1
 
1. Install fuse
 
yum install fuse fuse-libs fuse-devel  
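Before moving on, it is worth a quick check that the fuse kernel module is available (an optional sanity check, not part of the original steps):

modprobe fuse              # load the fuse kernel module if it is not already loaded
lsmod | grep fuse          # the module should show up in this listing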
 
2. Install ant
 
Download it from the official site and extract it.
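As a concrete sketch, assuming ant 1.9.1 from the Apache archive and the /usr/ant path that ANT_HOME points to below (adjust the URL and paths to your environment):

wget http://archive.apache.org/dist/ant/binaries/apache-ant-1.9.1-bin.tar.gz
tar -xzf apache-ant-1.9.1-bin.tar.gz -C /usr
mv /usr/apache-ant-1.9.1 /usr/ant          # so that ANT_HOME=/usr/ant below matches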
 
3. System configuration
 
vi /etc/profile
 
Append at the end (note that HADOOP_HOME must be set before CLASSPATH, which references it):
 
export OS_ARCH=i386     # use amd64 on a 64-bit machine
export OS_BIT=32        # use 64 on a 64-bit machine
export JAVA_HOME=/usr/java/jdk1.6.0_45
export ANT_HOME=/usr/ant
export HADOOP_HOME=/usr/hadoop
export PATH=$JAVA_HOME/bin:$ANT_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$HADOOP_HOME/lib:$HADOOP_HOME:$CLASSPATH
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:$HADOOP_HOME/c++/Linux-$OS_ARCH-$OS_BIT/lib:/usr/local/lib:/usr/lib
 
Save and exit, then reload the profile so the new variables take effect: source /etc/profile
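A quick way to confirm the variables took effect (optional check):

source /etc/profile
echo $JAVA_HOME $ANT_HOME $HADOOP_HOME
java -version              # expect 1.6.0_45
ant -version               # expect 1.9.1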
 
4. Build libhdfs
 
cd $HADOOP_HOME
ant compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1
ln -s c++/Linux-$OS_ARCH-$OS_BIT/lib build/libhdfs
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
 
(Tips: 1. If the build fails because of missing dependencies, install them with yum install automake autoconf m4 libtool pkgconfig fuse fuse-devel fuse-libs.
2. gcc also has to be installed along the way. A successful build ends with BUILD SUCCESSFUL, and seeing that line is a great feeling.)
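To see what the build produced, the listings below should show libhdfs and the fuse-dfs binaries (paths assume the Hadoop 1.x source layout used here):

ls -l $HADOOP_HOME/build/libhdfs              # the symlink created above, pointing at libhdfs.so*
ls $HADOOP_HOME/build/contrib/fuse-dfs        # should contain fuse_dfs and fuse_dfs_wrapper.sh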
 
5. Environment configuration
 
cd $HADOOP_HOME/build/contrib/fuse-dfs
vi fuse_dfs_wrapper.sh
Add at the very top of the file:
export JAVA_HOME=/usr/java/jdk1.6.0_45
export HADOOP_HOME=/usr/hadoop
export HADOOP_CONF_DIR=/usr/hadoop/conf
export OS_ARCH=i386
export OS_BIT=32
Change the last line from "./fuse_dfs $@" to "fuse_dfs $@" (so the fuse_dfs linked into /usr/local/bin in the next step is found on the PATH).
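After these edits the wrapper should look roughly like the sketch below; everything between the added exports and the final line is the script as shipped and stays untouched:

export JAVA_HOME=/usr/java/jdk1.6.0_45
export HADOOP_HOME=/usr/hadoop
export HADOOP_CONF_DIR=/usr/hadoop/conf
export OS_ARCH=i386
export OS_BIT=32
# ... original wrapper body (classpath and library path setup) unchanged ...
fuse_dfs $@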
 
6. Add permissions
 
$chmod +x /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh 
$chmod +x  /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs 
$ln -s  /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh /usr/local/bin 
$ln -s  /usr/hadoop/build/contrib/fuse-dfs/fuse_dfs /usr/local/bin/
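A quick way to confirm that the links resolve from the PATH (optional):

which fuse_dfs fuse_dfs_wrapper.sh         # both should point into /usr/local/bin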
 
7. Mount
 
mkdir /mnt/dfs
cd $HADOOP_HOME/build/contrib/fuse-dfs
fuse_dfs_wrapper.sh dfs://localhost:9000 /mnt/dfs
(It was this very last step that stumped me for a week! For the URI that follows fuse_dfs_wrapper.sh, I kept using the value configured in conf/core-site.xml, hdfs://localhost:9000, and kept getting the errors "fuse-dfs didn't recognize hdfs://localhost:9000,-2" and "fuse-dfs didn't recognize /mnt/dfs,-2". The wrapper wants the dfs:// scheme, not hdfs://.)
 
Finally, ls /mnt/dfs will show the files stored in HDFS.
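Once mounted it behaves like an ordinary filesystem; for example, to inspect or undo the mount (not part of the original steps):

df -h /mnt/dfs             # the fuse mount should be listed here
umount /mnt/dfs            # or: fusermount -u /mnt/dfs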
