During execution of the following simple code with Sparkit-Learn:

In:
```python
from splearn.svm import SparkLinearSVC
spark = SparkLinearSVC()
```

I get the following error message:

Note that I am working with a Jupyter notebook.

In accordance with these answers:

http://stackoverflow.com/questions/28829757/unable-to-add-spark-to-pythonpath
http://stackoverflow.com/questions/23256536/importing-pyspark-in-python-shell

I have added every possible configuration of those PYTHONPATHs to my .bashrc, but the error still occurs.

Currently my .bashrc paths look like this:

```bash
# For an IPython notebook and pyspark integration
if which pyspark > /dev/null; then
    # /usr/local/Cellar/apache-spark/1.5.1
    export SPARK_HOME="/usr/local/Cellar/apache-spark/1.6.1/libexec/"
    export PYSPARK_SUBMIT_ARGS="--master local[2]"
fi

export SPARK_HOME="/usr/local/Cellar/apache-spark/1.6.1/libexec"
export PYTHONPATH=$SPARK_HOME/python3/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python3/lib/py4j-0.9-src.zip:$PYTHONPATH
#export PYSPARK_PYTHON=python3
```

Any possible solution? I am running this on Ubuntu 16.04, PyCharm, and spark-1.6.1-bin-hadoop2.6.
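For reference, the spark-1.6.1 distribution ships its Python bindings under `python/` rather than `python3/`, with py4j bundled as `python/lib/py4j-0.9-src.zip`. A sketch of PYTHONPATH lines matching that stock layout (the Homebrew `libexec` prefix is carried over from the block above and may differ on other installs):

```bash
# Sketch assuming Spark 1.6.1's stock layout: bindings live in
# $SPARK_HOME/python, and py4j is bundled as python/lib/py4j-0.9-src.zip
# (the 1.6.x line shipped py4j 0.9).
export SPARK_HOME="/usr/local/Cellar/apache-spark/1.6.1/libexec"
export PYTHONPATH="$SPARK_HOME/python:$PYTHONPATH"
export PYTHONPATH="$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH"
```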
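To narrow things down, a minimal pyspark-only check can be run from the same notebook before involving splearn (a sketch: the `local[2]` master and the app name are arbitrary illustration choices):

```python
# Sanity-check sketch: confirm pyspark is importable and py4j works.
# Master URL and app name are arbitrary.
from pyspark import SparkContext

sc = SparkContext("local[2]", "pythonpath-check")
print(sc.version)                       # expect "1.6.1" if SPARK_HOME is picked up
print(sc.parallelize([1, 2, 3]).sum())  # tiny job to exercise py4j
sc.stop()
```

If even this import fails with the same error, the problem is in the PYTHONPATH setup rather than in Sparkit-Learn itself.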