Compiling and Installing Spark 1.4.1
2016-01-10 10:02
1. Download
Download address:
http://spark.apache.org/downloads.html
Select the source-code package.
![](http://www.aboutyun.com/data/attachment/forum/201507/28/184643t3paz4x6a9b034xe.jpg)
2. Building from source
1) Extract
tar -zxvf spark-1.4.1.tgz
2) Compile
Spark can be built in three ways:
1. An SBT build
2. A Maven build
3. A make-distribution.sh build (which also packages the result; see step 3 below)
Prerequisites: a JDK, Maven, and Scala.
Maven build:
mvn clean package \
-DskipTests -Phadoop-2.2 \
-Dhadoop.version=2.2.0 -Pyarn -Phive -Phive-thriftserver
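Before launching a long Maven build like the one above, it helps to confirm the prerequisite tools are actually on the PATH. A minimal sketch, assuming a POSIX shell (check_tools is an illustrative helper name, not a Spark script):

```shell
# Illustrative helper (not part of Spark): report any required build
# tools that are missing from PATH, and fail if there are any.
check_tools() {
  missing=""
  for tool in "$@"; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing" >&2
    return 1
  fi
}

# Usage in the build environment (Maven resolves the Scala compiler
# itself, so checking javac and mvn is usually enough):
#   check_tools javac mvn || exit 1
```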
3. Generate the deployment package
make-distribution build:
./make-distribution.sh --tgz \
-Phadoop-2.2 -Dhadoop.version=2.2.0 \
-Pyarn \
-Phive -Phive-thriftserver
From the Spark source root, build with make-distribution.sh:
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
cd spark-1.4.1
sudo ./make-distribution.sh --tgz --skip-java-test -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -Phive -Phive-thriftserver -DskipTests clean package
If the build fails partway through, simply re-run it; transient failures (typically dependency-download timeouts) usually clear up after a few attempts.
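Transient build failures can also be handled automatically with a small retry wrapper. A minimal sketch, assuming failures really are transient (retry is an illustrative name, not a Spark script):

```shell
# Illustrative retry wrapper (not part of Spark): re-run a command
# until it succeeds or a maximum number of attempts is reached.
retry() {
  max="$1"; shift
  n=1
  until "$@"; do
    if [ "$n" -ge "$max" ]; then
      echo "giving up after $n attempts" >&2
      return 1
    fi
    echo "attempt $n failed, retrying..." >&2
    n=$((n + 1))
  done
}

# Usage in the Spark source root, e.g.:
#   retry 3 ./make-distribution.sh --tgz -Pyarn \
#     -Phadoop-2.2 -Dhadoop.version=2.2.0 -Phive -Phive-thriftserver
```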
After a successful build, the installation package is in the source root:
spark-1.4.1-bin-2.2.0.tgz
3. Installation: the same as for earlier versions, so it is omitted here.
4. Troubleshooting
Problems when starting the cluster:
1) Problem 1: worker nodes fail to start
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/lib/spark-1.4.1/sbin/../logs/org.apache.spark.deploy.worker.Worker-1-is xxxx.out
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost:   at org.apache.spark.launcher.SparkClassCommandBuilder.buildCommand(SparkClassCommandBuilder.java:98)
localhost:   at org.apache.spark.launcher.Main.main(Main.java:74)
localhost: full log in /home/lib/spark-1.4.1/sbin/../logs/org.apache.spark.deploy.worker.Worker-1-is xxxx.out
localhost: Connection to localhost closed.
The cause is the Java (GCJ) that ships with the operating system. Check with:
rpm -qa | grep java
gcc-java-4.4.7-4.el6.x86_64
java_cup-0.10k-5.el6.x86_64
java-1.5.0-gcj-1.5.0.0-29.1.el6.x86_64
Uninstall them:
rpm -e --nodeps java_cup-0.10k-5.el6.x86_64
rpm -e --nodeps java-1.5.0-gcj-1.5.0.0-29.1.el6.x86_64
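That lookup can also be scripted. A minimal sketch that filters a package list down to the GCJ-era entries; the package names are the examples from the text, and find_gcj_java is an illustrative name, not a system tool:

```shell
# Illustrative sketch: select the GCJ-related Java packages from a
# package list (on a real system, pipe in: rpm -qa | grep java).
find_gcj_java() {
  grep -E 'gcj|java_cup'
}

# Sample input taken from the listing above.
pkgs='gcc-java-4.4.7-4.el6.x86_64
java_cup-0.10k-5.el6.x86_64
java-1.5.0-gcj-1.5.0.0-29.1.el6.x86_64'

echo "$pkgs" | find_gcj_java
# The matches could then be removed with: ... | xargs -r -n1 rpm -e --nodeps
```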
2) Problem 2: JAVA_HOME is not set
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/lib/spark-1.4.1/sbin/../logs/spark-org.apache.spark.deploy.worker.Worker-1-is xxxx.out
localhost: failed to launch org.apache.spark.deploy.worker.Worker:
localhost: JAVA_HOME is not set
localhost: full log in /lib/spark-1.4.1/sbin/../logs/org.apache.spark.deploy.worker.Worker-1-isxxxx.out
localhost: Connection to localhost closed.
The fix: add export JAVA_HOME=... to conf/spark-env.sh, the file sourced by the failing launch script.
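Concretely, the entry looks like the following; the JDK path shown is a placeholder, not a value from the original text, so substitute your actual install location:

```shell
# conf/spark-env.sh -- sourced by Spark's start-up scripts.
# If the file does not exist yet, create it from the shipped template:
#   cp conf/spark-env.sh.template conf/spark-env.sh

# Placeholder path: point JAVA_HOME at your real JDK install.
export JAVA_HOME=/usr/java/jdk1.7.0_79
export PATH=$JAVA_HOME/bin:$PATH
```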
The web UI after a successful start:
![](http://www.aboutyun.com/data/attachment/forum/201507/28/184644xtw8hwjmtbetq4g8.png)