1. Install the JDK
Download the installer:
https://www.oracle.com/java/technologies/javase-jdk13-downloads.html
![](https://img.haomeiwen.com/i13751190/c1c5c774da907537.png)
Install:
Locate the downloaded JDK package in Finder and follow the installer prompts, then verify in a terminal:
java -version
# If the output contains java version "13.0.2", the installation succeeded
2. Install Spark
Download the package from the Apache Spark downloads page:
![](https://img.haomeiwen.com/i13751190/273acd3caec44fb4.png)
Installation steps:
cd /usr/local
mv ~/Downloads/spark-3.0.0-preview2-bin-hadoop2.7.tgz .  # the Spark tarball is in the Downloads directory; writing to /usr/local may require sudo
tar -zxvf spark-3.0.0-preview2-bin-hadoop2.7.tgz
vim ~/.bash_profile
# add the following configuration to ~/.bash_profile
export SPARK_HOME=/usr/local/spark-3.0.0-preview2-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin
export PYSPARK_PYTHON=python3
#
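Spark also needs to locate a JDK at runtime. If spark-shell or pyspark later complains that Java cannot be found, a JAVA_HOME entry can be added to the same file. A minimal sketch, assuming macOS, where the built-in /usr/libexec/java_home tool prints the path of the active JDK:

```shell
# optional: let Spark find the JDK installed in step 1 (macOS only)
export JAVA_HOME=$(/usr/libexec/java_home)
```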
source ~/.bash_profile
pip install pyspark  # install the PySpark Python package
Verify:
Run pyspark at the command line; if the interactive Spark shell starts, the installation works.