Installing Scala + Spark + SBT on macOS

Author: 梦想又照进现实 | Published 2019-12-22 13:51

Install Scala
Install Spark
Install SBT
Write a demo script

Installing Scala

1. Install Scala with brew
brew install scala

2. Configure environment variables
vi ~/.bash_profile
Add the following (the Cellar path depends on the version brew installed):
export SCALA_HOME=/usr/local/Cellar/scala/2.13.0
export PATH="$PATH:$SCALA_HOME/bin"
Apply the changes:
source ~/.bash_profile

3. Verify Scala

appledeMacBook-Air:bin apple$ scala
Welcome to Scala 2.13.0 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_181).
Type in expressions for evaluation. Or try :help.

scala> var data=Array(1,2,3,4,5)
data: Array[Int] = Array(1, 2, 3, 4, 5)
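Beyond declaring an array, a couple more expressions confirm the toolchain works end to end. This is a minimal local sketch using only the standard collections (no Spark involved):

```scala
// A few standalone checks that the Scala installation works.
object ScalaCheck extends App {
  val data = Array(1, 2, 3, 4, 5)
  val doubled = data.map(_ * 2)          // one output per input element
  val evens   = data.filter(_ % 2 == 0)  // keep only even values
  println(doubled.mkString(", "))        // prints: 2, 4, 6, 8, 10
  println(evens.sum)                     // prints: 6
}
```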

Installing Spark

1. Install with brew
$ brew install apache-spark

1.2. The download fails (the 2.4.3 archive is no longer on the mirrors):

appledeMacBook-Air:home apple$ brew install apache-spark
==> Downloading https://www.apache.org/dyn/closer.lua?path=spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz
==> Downloading from http://mirror.bit.edu.cn/apache/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz

curl: (22) The requested URL returned error: 404 Not Found
Trying a mirror...
==> Downloading https://www-eu.apache.org/dist/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz

curl: (22) The requested URL returned error: 404 Not Found
Trying a mirror...
==> Downloading https://www-us.apache.org/dist/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz

curl: (22) The requested URL returned error: 404 Not Found
Error: An exception occurred within a child process:
  DownloadError: Failed to download resource "apache-spark"
Download failed: https://www-us.apache.org/dist/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz

1.3. Download manually from the official site instead
After unpacking, move the folder to a suitable location:
mv /Users/apple/Downloads/spark-2.4.4-bin-hadoop2.7 /usr/local/Cellar/apache-spark

2. Start the shell
cd /usr/local/Cellar/apache-spark/spark-2.4.4-bin-hadoop2.7/bin
./spark-shell
The startup screen on success:

[Screenshot: spark-shell startup banner (1576993257638.jpg)]

Installing SBT

1. Install with brew
appledeMacBook-Air:bin apple$ brew install sbt

==> Downloading https://sbt-downloads.cdnedge.bluemix.net/releases/v1.2.8/sbt-1.2.8.tgz
######################################################################## 100.0%
==> Caveats
You can use $SBT_OPTS to pass additional JVM options to sbt.
Project specific options should be placed in .sbtopts in the root of your project.
Global settings should be placed in /usr/local/etc/sbtopts
==> Summary
🍺  /usr/local/Cellar/sbt/1.2.8: 521 files, 50MB, built in 18 minutes 29 seconds

2. Configure environment variables in ~/.bash_profile
vi ~/.bash_profile
Add:
export SBT_HOME=/usr/local/Cellar/sbt/1.2.8
export PATH="$PATH:$SBT_HOME/bin"
Apply the changes:
source ~/.bash_profile

3. Check the version (in sbt 1.x the task key is camelCase `sbtVersion`; `sbt-version` was the sbt 0.13 form)
apple$ sbt sbtVersion
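With sbt working, a minimal build.sbt for a Spark project might look like the sketch below. The project name and versions are illustrative; note that Spark 2.4.x is built against Scala 2.11/2.12, not the Scala 2.13 installed above, so an sbt project pins its own scalaVersion:

```scala
// build.sbt — hypothetical minimal Spark project definition
name := "spark-demo"
version := "0.1.0"
scalaVersion := "2.12.10"  // Spark 2.4.x supports Scala 2.11/2.12, not 2.13

// "provided" because spark-submit / spark-shell supply Spark at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.4" % "provided"
```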

Writing a demo script

1. Word count in spark-shell:

sc.textFile("/usr/local/Cellar/apache-spark/spark-2.4.4-bin-hadoop2.7/README.md")
.flatMap(line => line.split(" "))
.map(w => (w, 1))
.reduceByKey(_+_)
.foreach(println)
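The same flatMap → map → reduce-by-key pipeline can be sketched without a Spark cluster using plain Scala 2.13 collections. This is only an illustration of the counting logic, not the Spark API; the input lines are made up:

```scala
// Word count over an in-memory list of lines, mirroring the spark-shell pipeline.
object WordCount extends App {
  val lines = Seq("spark is fast", "scala is fun", "spark is scalable")
  val counts: Map[String, Int] =
    lines
      .flatMap(_.split(" "))              // split each line into words
      .map(w => (w, 1))                   // pair each word with a count of 1
      .groupMapReduce(_._1)(_._2)(_ + _)  // local equivalent of reduceByKey
  counts.foreach(println)
}
```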

2. The map operation applies a function to every element to produce a new RDD; input and output elements correspond one-to-one:

scala> var rdd1 = sc.parallelize(1 to 9,3)
scala> var rdd2 = rdd1.map(x=>x*2)
scala> rdd2.collect
res9: Array[Int] = Array(2, 4, 6, 8, 10, 12, 14, 16, 18)
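The one-to-one nature of map contrasts with flatMap, which may emit zero or more elements per input. Plain Scala collections behave the same way, so the distinction can be shown locally without Spark (a sketch with made-up data):

```scala
// map is one-to-one; flatMap is one-to-many.
object MapDemo extends App {
  val nums = (1 to 9).toList
  val doubled = nums.map(_ * 2)               // 9 inputs -> 9 outputs
  val pairs   = nums.flatMap(x => List(x, x)) // 9 inputs -> 18 outputs
  println(doubled)      // prints: List(2, 4, 6, 8, 10, 12, 14, 16, 18)
  println(pairs.size)   // prints: 18
}
```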


Original link: https://www.haomeiwen.com/subject/aironctx.html