Spark for Python: Linux
2017-01-10 11:12
ubuntu 14.04
Downloads (if the connection is refused, see http://blog.csdn.net/houxiaoqin/article/details/54096175):
Anaconda2-4.2.0-Linux-x86_64.sh 【http://continuum.io/downloads#all】
jdk-8u111-linux-x64.tar.gz 【http://www.oracle.com/technetwork/java/javase/downloads/】
spark-1.5.2-bin-hadoop2.6.tgz 【http://spark.apache.org/downloads.html】
1. Installing Java 8
cd Downloads
ls
sudo mkdir -p /usr/lib/jvm
sudo mv jdk-8u111-linux-x64.tar.gz /usr/lib/jvm
cd /usr/lib/jvm
sudo tar xzvf jdk-8u111-linux-x64.tar.gz
sudo ln -s jdk1.8.0_111 java-8
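The unpack-and-symlink step above can be rehearsed without root using a throwaway tree under /tmp (the real paths are /usr/lib/jvm/jdk1.8.0_111 and /usr/lib/jvm/java-8); this is only a sketch of the idea, not the live install:

```shell
# Stand-in for /usr/lib/jvm: a versioned JDK dir plus a stable "java-8" symlink.
mkdir -p /tmp/jvm-demo/jdk1.8.0_111/bin
ln -sfn jdk1.8.0_111 /tmp/jvm-demo/java-8   # stable name pointing at the versioned dir
readlink /tmp/jvm-demo/java-8               # prints: jdk1.8.0_111
```

The stable symlink is what goes into JAVA_HOME, so a later JDK upgrade only needs the symlink repointed, not every config file edited.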
Set the environment variables:
vi ~/.bashrc
Append at the end of the file:
export JAVA_HOME=/usr/lib/jvm/java-8
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH
Save and quit with :wq
source ~/.bashrc    # apply the changes
java -version
【Note: if the trailing :$PATH is omitted, misspelled, or written in lowercase, the original PATH is overwritten outright; the system can no longer find its original directories and basic commands such as ls stop working.】
【If you do overwrite it, run export PATH=/usr/bin:/bin on the command line, then open vi ~/.bashrc to find and fix the mistake.】
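The PATH warning above can be demonstrated safely with plain demo variables (not the live PATH); the directories here are illustrative:

```shell
# Prepending with :$OLD_PATH keeps the original entries; a bare assignment loses them.
OLD_PATH="/usr/local/bin:/usr/bin:/bin"
GOOD="/usr/lib/jvm/java-8/bin:$OLD_PATH"    # old directories survive, ls still found
BAD="/usr/lib/jvm/java-8/bin"               # old directories gone, ls would break
echo "$GOOD"
echo "$BAD"
```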
Update the default JDK:
# update-alternatives --install /usr/bin/java java /usr/lib/jvm/jdk1.8.0_111/bin/java 300
# update-alternatives --install /usr/bin/javac javac /usr/lib/jvm/jdk1.8.0_111/bin/javac 300
# update-alternatives --config java
2. Installing Anaconda with Python 2.7
Run: bash Anaconda2-4.2.0-Linux-x86_64.sh
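Before executing a large downloaded installer, it is worth checking its checksum against the hash published on the download page; a sketch with an empty stand-in file (the hash shown is the md5 of an empty file, not the real Anaconda hash):

```shell
# Demo only: /tmp/demo-installer.sh stands in for Anaconda2-4.2.0-Linux-x86_64.sh.
: > /tmp/demo-installer.sh                   # create an empty stand-in file
md5sum /tmp/demo-installer.sh                # empty file -> d41d8cd98f00b204e9800998ecf8427e
```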
3. Installing Spark
cd Downloads
ls
mkdir -p /home/<username>/spark
tar -xf spark-1.5.2-bin-hadoop2.6.tgz
sudo mv spark-1.5.2-bin-hadoop2.6 /home/<username>/spark
cd /home/<username>/spark/spark-1.5.2-bin-hadoop2.6
./bin/pyspark
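Before launching, it helps to confirm the unpacked layout actually contains the launcher; a sketch using a /tmp stand-in for the Spark directory (the real one lives under your home directory as above):

```shell
# Stand-in tree mirroring the expected spark-1.5.2-bin-hadoop2.6 layout.
SPARK_DIR=/tmp/spark-demo/spark-1.5.2-bin-hadoop2.6
mkdir -p "$SPARK_DIR/bin"
touch "$SPARK_DIR/bin/pyspark"
if [ -f "$SPARK_DIR/bin/pyspark" ]; then
  echo "pyspark launcher present"
fi
```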
4. Enabling Jupyter Notebook
Run: jupyter notebook
To connect Jupyter to Spark:
Run: PYSPARK_DRIVER_PYTHON=jupyter PYSPARK_DRIVER_PYTHON_OPTS="notebook" $SPARK_HOME/bin/pyspark --master local[*]
where $SPARK_HOME is an environment variable set to the Spark home directory.
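The launch line can be composed as a dry run first, which makes it easy to see where SPARK_HOME ends up; the path below is an assumed example, not a required location:

```shell
# Dry run: build the command string without executing it.
SPARK_HOME="$HOME/spark/spark-1.5.2-bin-hadoop2.6"   # assumed install path from step 3
CMD="PYSPARK_DRIVER_PYTHON=jupyter PYSPARK_DRIVER_PYTHON_OPTS=\"notebook\" $SPARK_HOME/bin/pyspark --master local[*]"
echo "$CMD"
```

local[*] asks Spark to use one worker thread per available CPU core on the local machine.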
https://developer.ibm.com/hadoop/2016/05/04/install-jupyter-notebook-spark/
https://jupyter.readthedocs.io/en/latest/install.html
http://blog.csdn.net/tina_ttl/article/details/51031113
5. Virtualizing the environment with Docker
$ sudo apt-get update
$ sudo apt-get install curl
$ curl -fsSL https://get.docker.com/ | sh
https://docs.docker.com/engine/getstarted/linux_install_help/
If dpkg reports "error processing package oracle-java8-installer (--configure)", the archive /var/cache/oracle-jdk8-installer/jdk-8u111-linux-x64.tar.gz is most likely incomplete (check with ls -lht). Replace it with the full download:
sudo mv /usr/lib/jvm/jdk-8u111-linux-x64.tar.gz /var/cache/oracle-jdk8-installer/
curl -fsSL https://get.docker.com/ | sh
sudo docker version