1、vi ~/.bashrc
export PYSPARK_DRIVER_PYTHON=ipython
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
source ~/.bashrc
2、vi pyspark
In the pyspark launcher script, change python to ipython

3、jupyter notebook --generate-config

vi /data1/user/zhanghd/.jupyter/jupyter_notebook_config.py
Change the listening address to *
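A minimal sketch of the relevant lines in jupyter_notebook_config.py (option names are from the classic Jupyter Notebook server; open_browser is an extra setting commonly needed on a headless server, not mentioned in the original):

```python
# jupyter_notebook_config.py (classic Jupyter Notebook server)
c.NotebookApp.ip = '*'              # listen on all interfaces, not just localhost
c.NotebookApp.open_browser = False  # headless server: don't try to launch a browser
```

Note that newer Jupyter versions may reject '*' and require '0.0.0.0' instead.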

4、Then start pyspark and open the URL it prints

5、Access that URL from the local machine; Spark can then read files from HDFS
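A minimal sketch of reading an HDFS file from a notebook cell. The namenode host, port, and file path are placeholders; in the shell started by pyspark an SparkContext named sc already exists, so creating one is only needed otherwise:

```python
from pyspark import SparkContext

sc = SparkContext(appName="hdfs-read-demo")  # skip this in the pyspark shell, where `sc` is predefined

# hypothetical HDFS URI -- replace namenode:8020 and the path with your cluster's values
lines = sc.textFile("hdfs://namenode:8020/path/to/file.txt")
print(lines.count())  # number of lines in the file

sc.stop()
```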
