I recently had to set up a Scala environment for work. Below is a summary of all the odd problems I ran into along the way.
![](https://img.haomeiwen.com/i12567126/409447fa335873f3.jpg)
Step1 Check the JDK installation
Run java -version and javac -version in a cmd window. The java command usually works fine. If instead you get the error "'javac' is not recognized as an internal or external command, operable program or batch file", the cause is almost always a misconfigured environment variable.
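For reference, the two checks in a cmd window (the exact version strings depend on the JDK you installed):

```bat
REM Both commands should print a version; if only javac fails, fix the environment variables in Step2
java -version
javac -version
```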
Step2 Configure the environment variables
Taking Win10 as an example, open Control Panel -> System Properties -> Advanced system settings -> Environment Variables to reach the system variables.
Create a new variable CLASSPATH
with the value .;%JAVA_HOME%\lib\dt.jar;%JAVA_HOME%\lib\tools.jar;
Create a new variable JAVA_HOME
with the value H:\Program Files\Java\jdk (the directory your JDK is installed in)
Edit the Path variable (create it if it does not exist)
so that it contains C:\ProgramData\Oracle\Java\javapath;%JAVA_HOME%\bin;%JAVA_HOME%\jre\bin;
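If you prefer the command line, the same system variables can also be written from an elevated cmd window with setx /M. A sketch using the example paths above (adjust them to your own JDK directory):

```bat
REM Run from an elevated (administrator) cmd window; /M writes system-level variables.
REM Open a new cmd window afterwards so the changes are picked up.
setx /M JAVA_HOME "H:\Program Files\Java\jdk"
setx /M CLASSPATH ".;H:\Program Files\Java\jdk\lib\dt.jar;H:\Program Files\Java\jdk\lib\tools.jar;"
REM Path is safer to edit in the GUI: setx replaces the whole value rather than appending to it.
```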
Step3
If, like me, you're unlucky enough not to have permission to edit the system variables (as in the screenshot, the buttons for the system variables are grayed out), don't worry: you just need a standalone environment-variable editor.
Download RapidEE Portable, a small tool for editing Windows environment variables. Restart it as administrator, then use the menu: Edit -> Add variable -> Add value.
![](https://img.haomeiwen.com/i12567126/77dd05feabe5c999.jpg)
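As an aside, user-level variables can be set without any administrator rights at all, straight from a normal cmd window; a sketch with the example path from Step2:

```bat
REM setx without /M writes a user-level variable and needs no administrator rights;
REM open a new cmd window afterwards for the change to become visible
setx JAVA_HOME "H:\Program Files\Java\jdk"
```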
Step4 Set up the Hadoop environment
Test the hadoop command. Here I ran into a "The system cannot find the path specified" error. After some searching, this usually comes down to the JDK installation path: the Hadoop scripts fail because they cannot resolve the JDK location. The fix is to open %HADOOP_HOME%\etc\hadoop\hadoop-env.cmd and change
set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_161
to
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_161
(C:\PROGRA~1 is the 8.3 short name for C:\Program Files, which removes the space that trips up the scripts.) Reference: https://juejin.im/post/5d4edfc351882511db0aa3e9
![](https://img.haomeiwen.com/i12567126/e9bd62bc98a30b39.png)
![](https://img.haomeiwen.com/i12567126/1f9c40745aea4b63.png)
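After editing hadoop-env.cmd, a new cmd window should pass the check below (assuming %HADOOP_HOME%\bin is already on Path):

```bat
REM Should print the Hadoop version instead of "The system cannot find the path specified"
hadoop version
```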
Step5 Install the JDK
Download page: https://www.oracle.com/java/technologies/javase-downloads.html
Step6 Install Scala
![](https://img.haomeiwen.com/i12567126/6792da984f6a94b4.png)
![](https://img.haomeiwen.com/i12567126/d08a18b7ce24f8e9.png)
![](https://img.haomeiwen.com/i12567126/b269dfcb39f1ca5d.png)
![](https://img.haomeiwen.com/i12567126/682ddd6a02e4348a.png)
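Once the installer finishes and the Scala bin directory is on Path, a quick sanity check from cmd:

```bat
REM Both commands should print the installed Scala version
scala -version
scalac -version
REM Running "scala" with no arguments starts the REPL; try println("Hello, Scala") there, then :quit
```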
Step7 POM configuration
The pom.xml for the Spark + Scala project looks like this:
```xml
<properties>
    <spark.version>2.2.0</spark.version>
    <scala.version>2.11</scala.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_${scala.version}</artifactId>
        <version>${spark.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.ansj</groupId>
        <artifactId>ansj_seg</artifactId>
        <version>5.1.1</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <version>2.15.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.6.0</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
                <appendAssemblyId>false</appendAssemblyId>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <configuration>
                <skip>true</skip>
            </configuration>
        </plugin>
    </plugins>
</build>
```
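Because the Spark dependencies are marked provided, they are not packed into the assembled jar; spark-submit supplies them at run time. A minimal build-and-run sketch (the main class com.example.Main and the jar name are placeholders for your own project):

```bat
REM Package the project; the assembly plugin builds the jar during the package phase
mvn clean package

REM Run locally; Spark 2.2.0 / Scala 2.11 must match the POM above.
REM The --class value and the jar path are placeholders, replace them with your own.
spark-submit --class com.example.Main --master local[*] target\your-artifact-1.0.jar
```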