
Integrating Apache Spark with PyCharm

2016-04-01 13:41
Reference:

Under /Applications/PyCharm CE.app/Contents/bin I wrote a pycharm.sh containing:

export PYTHONPATH=/usr/local/share/spark1626/python/:/usr/local/share/spark1626/python/lib/py4j-0.9-src.zip
export SPARK_HOME=/usr/local/share/spark1626
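The two exports above can also be computed inside Python. A minimal sketch, assuming the same spark1626 install location; the `spark_python_paths` helper is hypothetical, and the glob avoids hard-coding the py4j version, which changes between Spark releases (0.9 here, 0.8.2.1 in the comment quoted below):

```python
import glob
import os

def spark_python_paths(spark_home):
    """Return the entries PySpark needs on the Python path for a SPARK_HOME."""
    python_dir = os.path.join(spark_home, "python")
    # The py4j zip name varies by Spark release (py4j-0.9-src.zip,
    # py4j-0.8.2.1-src.zip, ...), so match it with a glob.
    py4j_zips = glob.glob(os.path.join(python_dir, "lib", "py4j-*-src.zip"))
    return [python_dir] + py4j_zips

# Hypothetical install location, matching the exports above.
paths = spark_python_paths("/usr/local/share/spark1626")
print(paths[0])  # /usr/local/share/spark1626/python
```

The same two entries are what the shell script joins into PYTHONPATH with a colon.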


That did not solve it, and I am not sure why.

So I followed this comment instead:

I have a better way:

1. Create a new Python virtual environment: go to PyCharm -> Preferences -> Project.

2. On the “Project Interpreter” line, create a new virtual environment (click the gear icon on the right).

3. Once the virtual environment is created, go back to the same menu, click “More” to see the list of all project interpreters, and make sure your virtual environment is selected.

4. Click the “Show paths for the selected interpreter” button at the bottom.

5. In the next dialog, click the “+” icon to add paths. You will need to add SPARK_HOME/python and SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip.

This should set you up to run and debug.
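Once the two paths are added to the interpreter, a quick sanity check is to ask the interpreter whether it can resolve the pyspark package at all. A minimal sketch; the `pyspark_visible` helper is hypothetical:

```python
import importlib.util

def pyspark_visible():
    """True if the current interpreter can resolve the pyspark package."""
    return importlib.util.find_spec("pyspark") is not None

# If the SPARK_HOME/python path was added correctly, find_spec resolves
# pyspark without triggering a full import (and without needing a JVM).
if pyspark_visible():
    print("pyspark is on sys.path -- run/debug should work")
else:
    print("pyspark not found -- re-check the interpreter paths")
```

Running this from a PyCharm run configuration tests exactly the interpreter PyCharm will use, which a terminal check does not.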

It also seems necessary to add SPARK_HOME itself to PYTHONPATH.
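An alternative to editing interpreter paths or run-configuration environment variables is to set everything at the top of the script itself. A sketch, assuming the same spark1626 location as the shell exports earlier; the path literals are from this post, not a general default:

```python
import os
import sys

# Hypothetical install location, matching the exports earlier in the post.
SPARK_HOME = "/usr/local/share/spark1626"

# Only set SPARK_HOME if the environment does not already define it.
os.environ.setdefault("SPARK_HOME", SPARK_HOME)

# Prepend the PySpark entries so they win over any stale installs.
for entry in (os.path.join(SPARK_HOME, "python"),
              os.path.join(SPARK_HOME, "python", "lib", "py4j-0.9-src.zip")):
    if entry not in sys.path:
        sys.path.insert(0, entry)
```

This must run before any `import pyspark` statement, so it belongs at the very top of the entry-point script.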