Deploying Spark 2.4.4 on Mac

Author: 程序员欣宸 | Published 2020-09-21 07:34

    Environment

    1. OS: macOS Mojave 10.14.6
    2. JDK: 1.8.0_211 (installed at: /Library/Java/JavaVirtualMachines/jdk1.8.0_211.jdk/Contents/Home)

    Prerequisites

    Please follow the earlier article 《Mac部署hadoop3(伪分布式)》 (deploying Hadoop 3 in pseudo-distributed mode on a Mac) and make sure Hadoop 3 is already set up on your Mac.

    Deployment steps

    1. Install Scala:
    brew install scala
    
    2. Configure the Scala environment variables: open <font color="blue">~/.bash_profile</font> and add the following:
    export SCALA_HOME=/usr/local/Cellar/scala/2.13.0
    export PATH=$PATH:$SCALA_HOME/bin
    
    3. Run <font color="blue">source ~/.bash_profile</font>, then verify Scala:
    (base) zhaoqindeMBP:~ zhaoqin$ scala -version
    Scala code runner version 2.13.0 -- Copyright 2002-2019, LAMP/EPFL and Lightbend, Inc.
    
    4. Download Spark from http://spark.apache.org/downloads.html (the original post highlights the 2.4.4 package in a screenshot, omitted here)
    5. Extract the downloaded archive under <font color="blue">/usr/local/</font> and rename the folder from <font color="blue">spark-2.4.4-bin-hadoop2.7</font> to <font color="blue">spark</font>
    6. Configure the Spark environment variables: open <font color="blue">~/.bash_profile</font> and add the following:
    export SPARK_HOME=/usr/local/spark
    export PATH=$PATH:$SPARK_HOME/bin
    
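    The download/extract/rename steps above can be sketched as a few terminal commands. This is only a sketch: the archive filename matches the 2.4.4 (Hadoop 2.7) package used in this article, but the ~/Downloads location is an assumption, and sudo may or may not be needed depending on your /usr/local permissions.

    ```shell
    # Assumes the Spark package was saved to ~/Downloads; adjust the path
    # if your browser saved it elsewhere.
    cd /usr/local/

    # Extract the archive (sudo may be required for /usr/local)
    sudo tar -zxf ~/Downloads/spark-2.4.4-bin-hadoop2.7.tgz

    # Rename the extracted folder to the shorter name used in this article
    sudo mv spark-2.4.4-bin-hadoop2.7 spark
    ```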
    7. Run <font color="blue">source ~/.bash_profile</font> to make the configuration take effect;
    8. Open <font color="blue">spark/conf/spark-env.sh</font> and append the following three lines:
    export SCALA_HOME=/usr/local/Cellar/scala/2.13.0
    export SPARK_MASTER_IP=localhost
    export SPARK_WORKER_MEMORY=2G
    
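    The SPARK_MASTER_IP and SPARK_WORKER_MEMORY settings above apply to Spark's standalone master and worker, which can be started with the scripts bundled in the distribution. A sketch (the spark://localhost:7077 URL assumes the localhost master and default port):

    ```shell
    # Start the standalone master (its web UI defaults to http://localhost:8080)
    $SPARK_HOME/sbin/start-master.sh

    # Start a worker and register it with the local master
    # (in Spark 2.4.x this script is named start-slave.sh)
    $SPARK_HOME/sbin/start-slave.sh spark://localhost:7077

    # Stop them later with stop-slave.sh and stop-master.sh
    ```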
    9. Make sure HDFS and YARN are running, then run <font color="blue">spark-shell</font> to start Spark:
    (base) zhaoqindeMBP:~ zhaoqin$ spark-shell
    19/10/27 13:33:51 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    Spark context Web UI available at http://zhaoqindembp:4040
    Spark context available as 'sc' (master = local[*], app id = local-1572154437623).
    Spark session available as 'spark'.
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.4.4
          /_/
    
    Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_211)
    Type in expressions to have them evaluated.
    Type :help for more information.
    
    scala>
    
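    Beyond the interactive shell, a quick smoke test is to submit the SparkPi example that ships with the distribution. A sketch; the examples jar name below matches the 2.4.4 (Scala 2.11) package used in this article:

    ```shell
    # Submit the bundled SparkPi example on a local 2-core master;
    # the output should include a line like "Pi is roughly 3.14..."
    spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --master local[2] \
      $SPARK_HOME/examples/jars/spark-examples_2.11-2.4.4.jar 10
    ```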

    At this point, both Hadoop and Spark are running on the Mac. I hope this article serves as a useful reference.

    https://github.com/zq2599/blog_demos
