Compiling Hadoop 2.4.0 from Source
2015-04-19 12:03
System: Ubuntu 14.04, 32-bit. Until now I had always used the official binary package (the official builds are 32-bit), but this time I tried compiling it myself. The process, roughly:
1. Download the hadoop-2.4.0-src.tar.gz source package
Official download: http://hadoop.apache.org/
Baidu netdisk mirror: http://pan.baidu.com/s/1pJ7gdMN
Extract it to get the Hadoop source directory: hadoop-2.4.0-src
2. Install the software required for the build:
1) JDK 1.7 (be sure not to build with JDK 1.8):
Official download: http://www.oracle.com/technetwork/java/javase/downloads/index.html
Baidu netdisk mirror: http://pan.baidu.com/s/1gd9yrb9
Extract it into /usr/local/ and set the environment variables:
export JAVA_HOME=/usr/local/jdk1.7.0_79
export JRE_HOME=/usr/local/jdk1.7.0_79/jre
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
Then exit the editor and source the file.
Check that the installation succeeded:
java -version
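All of the export lines in this guide belong in a startup file such as ~/.bashrc. As a convenience, here is a small sketch for adding them without piling up duplicates on repeated runs; the helper name append_once is my own, not part of the original instructions:

```shell
# Sketch: append a line to a file only if it is not already present,
# so re-running the environment setup stays idempotent.
# append_once is a hypothetical helper, not from the original post.
append_once() {
  line=$1
  file=$2
  grep -qxF "$line" "$file" 2>/dev/null || printf '%s\n' "$line" >> "$file"
}

# Example, using the JDK paths from this post:
# append_once 'export JAVA_HOME=/usr/local/jdk1.7.0_79' ~/.bashrc
# append_once 'export PATH=$JAVA_HOME/bin:$PATH' ~/.bashrc
# source ~/.bashrc
```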
2) Maven
Official download: http://maven.apache.org/download.cgi
Baidu netdisk mirror: http://pan.baidu.com/s/1ntIF1S9
Extract it to get apache-maven-3.2.5, install it under /usr, and set the environment variables:
export MAVEN_HOME=/usr/apache-maven-3.2.5
export MAVEN_OPTS="-Xms128m -Xmx512m"
export PATH=$MAVEN_HOME/bin:$PATH
Then exit the editor and source the file.
Verify the installation:
mvn -v
3) Ant
Official download: http://ant.apache.org/bindownload.cgi
Baidu netdisk mirror: http://pan.baidu.com/s/1bnASwJd
Extract it to get apache-ant-1.9.4, install it under /usr, and set the environment variables:
export ANT_HOME=/usr/apache-ant-1.9.4
export PATH=$ANT_HOME/bin:$PATH
Then exit the editor and source the file.
Verify the installation:
ant -version
4) g++
sudo apt-get install g++
5) protobuf (Hadoop 2.4.0 requires protobuf 2.5.0):
Download (Baidu netdisk): http://pan.baidu.com/s/1jGotvQA
tar xzf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
sudo ./configure --prefix=/usr/protobuf   (a custom install prefix; alternatively, run sudo ./configure from the extracted directory to accept the default)
sudo make
sudo make install
sudo ldconfig
Set the environment variables (the library path points at the lib directory under the install prefix):
export PROTOC_HOME=/usr/protobuf
export PATH=$PROTOC_HOME/bin:$PATH
export LD_LIBRARY_PATH=/usr/protobuf/lib
Then exit the editor and source the file.
Verify the installation:
hadoop@master:~$ protoc --version
libprotoc 2.5.0
hadoop@master:~$
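Since Hadoop 2.4.0 is picky about the protobuf version, it can be handy to check the reported version in a script rather than by eye. A minimal sketch; check_protoc is my own helper name, and the second argument exists only so a different protoc binary can be pointed at for testing:

```shell
# Sketch: verify that a protoc binary reports exactly the required version.
# check_protoc is a hypothetical helper, not part of the original post.
check_protoc() {
  want=$1                      # version string we require, e.g. 2.5.0
  bin=${2:-protoc}             # protoc binary to query (defaults to protoc)
  got=$("$bin" --version 2>/dev/null | awk '{print $2}')
  [ "$got" = "$want" ]
}

# Example:
# check_protoc 2.5.0 && echo "protoc OK" || echo "wrong protoc version"
```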
6) CMake
Download (Baidu netdisk): http://pan.baidu.com/s/1sj7cadf
tar xzf cmake-2.8.12.2.tar.gz
cd cmake-2.8.12.2
sudo ./bootstrap --prefix=/usr/cmake   (a custom install prefix)
sudo make
sudo make install
Set the environment variables:
export CMAKE_HOME=/usr/cmake
export PATH=$CMAKE_HOME/bin:$PATH
7) OpenSSL development headers (libssl-dev)
sudo apt-get install libssl-dev
8) libglib2.0-dev
sudo apt-get install libglib2.0-dev
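Before kicking off the long Maven build, it is worth confirming that every tool installed above is actually on the PATH. A hedged sketch; check_tools is a hypothetical helper of my own:

```shell
# Sketch: report any build prerequisite that is not resolvable on PATH.
# check_tools is a hypothetical helper, not part of the original post.
check_tools() {
  for t in "$@"; do
    command -v "$t" >/dev/null 2>&1 || echo "missing: $t"
  done
}

# Example, covering this post's prerequisites (no output means all present):
# check_tools java mvn ant g++ protoc cmake
```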
3. Build hadoop-2.4.0
cd ~/hadoop-2.4.0-src
mvn package -Pdist -DskipTests -Dtar
Note: make sure the hadoop user owns the hadoop-2.4.0-src tree first:
sudo chown -R hadoop:hadoop hadoop-2.4.0-src
When the build finishes:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 4.940 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 2.429 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 3.136 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.521 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 2.996 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 4.248 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 2.875 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 2.447 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 2.249 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:41 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 5.997 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.085 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [04:11 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 19.371 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 31.596 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 4.239 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.054 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.073 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:43 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 52.105 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.114 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 5.712 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 56.731 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 1.851 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 14.598 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 10.669 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 2.724 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 3.387 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.290 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 2.537 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 1.601 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.116 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 9.501 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.075 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 20.997 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 15.518 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 1.601 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 5.851 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 4.736 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 16.372 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 1.167 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 3.231 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 5.312 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 8.653 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 23.317 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 1.464 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 3.259 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 2.664 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 1.524 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 1.646 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 0.035 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 3.909 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 8.041 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.147 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 13.277 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 6.222 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.024 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 39.922 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:00 min
[INFO] Finished at: 2015-04-19T10:52:55+08:00
[INFO] Final Memory: 126M/391M
[INFO] ------------------------------------------------------------------------
hadoop@master:~/hadoop-2.4.0-src$
The file we need, hadoop-2.4.0.tar.gz, is now in ~/hadoop-2.4.0-src/hadoop-dist/target.
Of course the build ran into quite a few problems; for lack of time I only recorded some of them. For details, see my other post: "Hadoop 2.4.0 Source Build Issues".
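If the build is driven from a script, the resulting tarball can be located programmatically. A minimal sketch, assuming the directory layout described above; find_dist_tarball is my own helper name:

```shell
# Sketch: print any distribution tarball produced under hadoop-dist/target
# of the given source tree (prints nothing if the build made none).
# find_dist_tarball is a hypothetical helper, not from the original post.
find_dist_tarball() {
  ls "$1"/hadoop-dist/target/hadoop-*.tar.gz 2>/dev/null
}

# Example:
# find_dist_tarball ~/hadoop-2.4.0-src
```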