1. Spark single-machine installation

1. Download Spark and unpack the archive.

2. Configure the environment variables in ~/.bashrc (run source ~/.bashrc afterwards so they take effect):

export SPARK_HOME=/home/cpp/software/spark
#export HADOOP_CONF_DIR=
#export YARN_CONF_DIR=
export PATH="/opt/anaconda3/bin:$PATH:/home/cpp/software/spark/bin"

export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.10.4-src.zip:$PYTHONPATH

export PYSPARK_PYTHON=python3

3. Run pyspark to test the interactive shell (a smoke test is sketched below).
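
A minimal smoke test, assuming the pyspark shell started successfully (the shell pre-creates a SparkContext named sc; the numbers here are only an example):

rdd = sc.parallelize(range(100))
print(rdd.sum())  # 4950 if the local Spark installation works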

4. Start python and verify that import pyspark works (see the sketch below).
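
If the import succeeds, a local SparkContext can be created directly. A short sketch, where the master string local[*] and the app name are illustrative:

from pyspark import SparkContext

sc = SparkContext("local[*]", "import-test")
print(sc.version)                         # the Spark version in use
print(sc.parallelize([1, 2, 3]).count())  # expect 3
sc.stop()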

5. Run a .py script from PyCharm; add the following at the top of the script so it can find Spark:

import sys
import os

# Make the bundled PySpark and Py4J importable (same paths as step 2)
sys.path.append("/home/cpp/software/spark/python")
sys.path.append("/home/cpp/software/spark/python/lib/py4j-0.10.4-src.zip")
os.environ['JAVA_HOME'] = '/usr/java/jdk1.7.0_60'  # JDK Spark should use
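
With the header above in place, a short word-count sketch (the sample data and app name are made up) verifies that the script runs end to end from PyCharm:

from pyspark import SparkContext

sc = SparkContext("local[*]", "pycharm-test")
words = sc.parallelize(["spark", "local", "spark"])
counts = words.map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)
print(counts.collect())  # e.g. [('spark', 2), ('local', 1)]
sc.stop()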
