
Compiling and Installing Hadoop 2.6.3

2016-01-19 18:14
I. Installation Environment

1.1 Java
Install Java 1.7.
Download JDK 1.7:
[root@node1 ~]# wget http://download.oracle.com/otn-pub/java/jdk/7u79-b15/jdk-7u79-linux-x64.tar.gz?AuthParam=1452765180_64b65bb908cae46ab9a9e492c842d7c7
Set the Java environment variables:

PATH=$PATH:$HOME/bin:/usr/local/mongodb-linux-x86_64-3.2.0/bin
JAVA_HOME=/usr/local/jdk1.7.0_79
CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$JRE_HOME/lib:$CLASSPATH
PATH=$JAVA_HOME/bin:$PATH
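
A quick way to apply and verify these settings, assuming the lines above were added to ~/.bash_profile:

[root@node1 ~]# source ~/.bash_profile
[root@node1 ~]# java -version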

1.2 Maven
Download and install Maven:
[root@node1~]# wget http://mirrors.cnnic.cn/apache/maven/maven-3/3.3.9/binaries/apache-maven-3.3.9-bin.tar.gz
Extract it to /usr/local/ and add it to PATH:

PATH=$JAVA_HOME/bin:/usr/local/apache-maven-3.3.9/bin:$PATH
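
To confirm Maven and the JDK are wired up correctly (mvn -version also reports the Java home it detected):

[root@node1 ~]# mvn -version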

1.3 Findbugs
Optional; skipped here.

1.4 protobuf
Reference: http://blog.csdn.net/huguoping830623/article/details/45482725
Install: ./configure --prefix=/usr/local/protobuf2.5/
make && make install

The build may fail with:
configure: error: C++ preprocessor "/lib/cpp" fails sanity check
This happens because the gcc-related packages are missing:

  # yum install glibc-headers
  # yum install gcc-c++

After installation, add /usr/local/protobuf2.5/bin to PATH.
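
A minimal check that the right protoc is now picked up (Hadoop 2.6.x must be compiled with protoc 2.5.0, as the version output in section II confirms):

PATH=/usr/local/protobuf2.5/bin:$PATH
[root@node1 ~]# protoc --version

This should print libprotoc 2.5.0.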

1.5 Other Packages

yum -y install  lzo-devel  zlib-devel  gcc autoconf automake libtool openssl-devel fuse-devel cmake


Requirements for the optional native components:
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes and to get the best HDFS encryption performance )
* Jansson C JSON parsing library ( if compiling libwebhdfs )



If one of these libraries (Jansson, for example) is not available from yum and has to be built from source, the usual steps are:

./configure

make && make install



* Linux FUSE (Filesystem in Userspace) version 2.6 or above ( if compiling fuse_dfs )

[root@node1 ~]# wget https://github.com/libfuse/libfuse/releases/download/fuse_2_9_4/fuse-2.8.6.tar.gz

tar -zxvf fuse-2.8.6.tar.gz

./configure
make -j8
make install
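
A quick sanity check that the FUSE userspace tools were installed (this simply prints the fusermount version found on the PATH):

[root@node1 ~]# fusermount -V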


* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

II. Compiling and Installing
1. Download the Hadoop source
[root@node1 ~]# wget http://www.apache.org/dyn/closer.cgi/hadoop/common/hadoop-2.6.3/hadoop-2.6.3-src.tar.gz
Then extract it.

2. Build Hadoop
Change into the extracted source directory and run:
$ mvn package -Pdist,native -DskipTests -Dtar
An Internet connection is required, because Maven downloads all build dependencies.
The build takes a long time. When it finishes, the output is placed under hadoop-2.6.3-src/hadoop-dist/target.
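
If the build is too slow, one optional shortcut (a standard Maven property, not part of the command above) is to skip javadoc generation:

$ mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true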

[root@node1 target]# ls -l
total 530484
drwxr-xr-x. 2 root root 4096 Jan 19 13:13 antrun
-rw-r--r--. 1 root root 1867 Jan 19 13:13 dist-layout-stitching.sh
-rw-r--r--. 1 root root 640 Jan 19 13:13 dist-tar-stitching.sh
drwxr-xr-x. 9 root root 4096 Jan 19 13:13 hadoop-2.6.3 (the extracted, compiled Hadoop directory)
-rw-r--r--. 1 root root 180792661 Jan 19 13:13 hadoop-2.6.3.tar.gz (the Hadoop distribution tarball)
-rw-r--r--. 1 root root 2778 Jan 19 13:13 hadoop-dist-2.6.3.jar
-rw-r--r--. 1 root root 362386511 Jan 19 13:13 hadoop-dist-2.6.3-javadoc.jar
drwxr-xr-x. 2 root root 4096 Jan 19 13:13 javadoc-bundle-options
drwxr-xr-x. 2 root root 4096 Jan 19 13:13 maven-archiver
drwxr-xr-x. 2 root root 4096 Jan 19 13:13 test-dir
Copy it to the installation directory:
[root@node1 target]# cp -r hadoop-2.6.3 /usr/local/
[root@node1 target]# cd /usr/local/hadoop-2.6.3
Edit the hadoop-env.sh file:
[root@node1 hadoop-2.6.3]# vi etc/hadoop/hadoop-env.sh
Change:

export JAVA_HOME=/usr/local/jdk1.7.0_79

Add:
export HADOOP_PREFIX=/usr/local/hadoop-2.6.3
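
Optionally (a convenience beyond the steps above), put the Hadoop scripts on the PATH as well, e.g. in ~/.bash_profile:

export HADOOP_PREFIX=/usr/local/hadoop-2.6.3
export PATH=$HADOOP_PREFIX/bin:$HADOOP_PREFIX/sbin:$PATH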

Then test it:

[root@node1 hadoop-2.6.3]# bin/hadoop version
Hadoop 2.6.3
Subversion Unknown -r Unknown
Compiled by root on 2016-01-19T04:56Z
Compiled with protoc 2.5.0
From source with checksum 722f77f825e326e13a86ff62b34ada
This command was run using /usr/local/hadoop-2.6.3/share/hadoop/common/hadoop-common-2.6.3.jar

Test successful!
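
Since the point of building from source is the native code, it is also worth checking the native libraries; hadoop checknative ships with the distribution, and which entries show up as true depends on the libraries available at build time:

[root@node1 hadoop-2.6.3]# bin/hadoop checknative -a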

III. Testing

$ mkdir input
$ cp etc/hadoop/*.xml input
$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.3.jar grep input output 'dfs[a-z.]+'
$ cat output/*
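
To re-run the example, remove the previous output directory first, because MapReduce refuses to write to an existing output path:

$ rm -r output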


If the job reports the error INFO metrics.MetricsUtil: Unable to obtain hostName: node1,
edit /etc/hosts and add node1 as an alias for the local machine, changing
127.0.0.1 localhost localhost.localdomain localhost
to
127.0.0.1 localhost.localdomain node1
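
A quick way to confirm the new mapping is in effect (getent consults /etc/hosts through the system resolver):

[root@node1 ~]# getent hosts node1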
