spark cassandra setup

Author: lingo_xp | Published 2018-03-06 13:09

I've spent the last few days on a data migration, moving everything from MySQL over to Cassandra, and hit quite a few pitfalls along the way, so here's a brief record. First, the environment setup; I installed Spark under the /opt/ directory.

# Install Spark
sudo mkdir /opt/spark
wget -qO- http://ftp.jaist.ac.jp/pub/apache/spark/spark-2.2.1/spark-2.2.1-bin-hadoop2.7.tgz | sudo tar xvz -C /opt/
sudo mv /opt/spark-2.2.1-bin-hadoop2.7/* /opt/spark/
sudo rm -rf /opt/spark-2.2.1-bin-hadoop2.7/
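
Optionally, put the Spark binaries on the PATH so spark-submit works from anywhere (an assumption about your shell setup; add it to ~/.bashrc or similar):

# Optional: expose the Spark binaries on the PATH (e.g. in ~/.bashrc)
export SPARK_HOME=/opt/spark
export PATH="$PATH:$SPARK_HOME/bin"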
# Install Cassandra
echo "deb http://www.apache.org/dist/cassandra/debian 311x main" | sudo tee -a /etc/apt/sources.list.d/cassandra.sources.list
curl https://www.apache.org/dist/cassandra/KEYS | sudo apt-key add -
sudo apt-get update
sudo apt-get install cassandra
sudo chmod -R 777 /var/log/cassandra/
sudo chmod -R 777 /var/lib/cassandra/
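
Before wiring Spark up, it's worth checking that the node actually came up, and creating a small keyspace to test against. The test keyspace and the kv / kv_copy tables below are just example names, reused in the read/write test at the end of this post:

# Verify the node is running
nodetool status
# Create a throwaway keyspace and two tables for the test
cqlsh -e "CREATE KEYSPACE IF NOT EXISTS test WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1};"
cqlsh -e "CREATE TABLE IF NOT EXISTS test.kv (key text PRIMARY KEY, value int);"
cqlsh -e "CREATE TABLE IF NOT EXISTS test.kv_copy (key text PRIMARY KEY, value int);"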

# Install extra Spark libraries
wget http://central.maven.org/maven2/com/twitter/jsr166e/1.1.0/jsr166e-1.1.0.jar
sudo mv jsr166e-1.1.0.jar /opt/spark/jars/

wget http://central.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.11/2.0.7/spark-cassandra-connector_2.11-2.0.7.jar
sudo mv spark-cassandra-connector_2.11-2.0.7.jar /opt/spark/jars/

You don't have to install these libraries into Spark's jars directory; submitting the jars at spark-submit time works too. If you're not running only on the local machine, make sure you add the libraries you use when you submit.
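
For example, a submit command along these lines (a sketch; the application jar name is a placeholder for your own build):

spark-submit --master local[*] \
  --jars /opt/spark/jars/jsr166e-1.1.0.jar,/opt/spark/jars/spark-cassandra-connector_2.11-2.0.7.jar \
  your-app-1.0-SNAPSHOT-uber.jar

# Or let Spark resolve the connector (and its jsr166e dependency) from Maven directly:
spark-submit --master local[*] \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.7 \
  your-app-1.0-SNAPSHOT-uber.jar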
Next, create a new Maven project in IntelliJ. Scala itself is up to 2.12 now, but the other libraries here only support 2.11, so I installed 2.11.8. Be sure to add the maven-shade-plugin to the pom; packaging with IntelliJ's built-in artifact builder kept hitting ClassNotFoundException. I've pasted my pom file below.

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>your groupID</groupId>
    <artifactId>your artifactID</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <encoding>UTF-8</encoding>
        <scala.version>2.11.8</scala.version>
        <scala.compat.version>2.11</scala.compat.version>
    </properties>

    <dependencies>
        <!-- https://mvnrepository.com/artifact/com.google.guava/guava -->
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>15.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.1</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.2.1</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>1.7.5</version>
        </dependency>
        <dependency>
            <groupId>org.clapper</groupId>
            <artifactId>grizzled-slf4j_2.11</artifactId>
            <version>1.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.11.8</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.cassandra</groupId>
            <artifactId>cassandra-driver-core</artifactId>
            <version>3.4.0</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.11</artifactId>
            <version>2.0.7</version>
        </dependency>
    </dependencies>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <testSourceDirectory>src/test/scala</testSourceDirectory>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>3.3.1</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                            <args>
                                <arg>-feature</arg>
                                <arg>-deprecation</arg>
                                <arg>-dependencyfile</arg>
                                <arg>${project.build.directory}/.scala_dependencies</arg>
                            </args>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.18.1</version>
                <configuration>
                    <useFile>false</useFile>
                    <disableXmlReport>true</disableXmlReport>
                    <!-- If you have classpath issue like NoDefClassError,... -->
                    <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
                    <includes>
                        <include>**/*Test.*</include>
                        <include>**/*Suite.*</include>
                    </includes>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
                <version>1.5.0</version>
                <executions>
                    <execution>
                        <id>run-local</id>
                        <goals>
                            <goal>exec</goal>
                        </goals>
                        <configuration>
                            <executable>spark-submit</executable>
                            <arguments>
                                <argument>--master</argument>
                                <argument>local</argument>
                                <argument>${project.build.directory}/${project.artifactId}-${project.version}-uber.jar</argument>
                            </arguments>
                        </configuration>
                    </execution>
                    <execution>
                        <id>run-yarn</id>
                        <goals>
                            <goal>exec</goal>
                        </goals>
                        <configuration>
                            <environmentVariables>
                                <HADOOP_CONF_DIR>
                                    ${basedir}/spark-remote/conf
                                </HADOOP_CONF_DIR>
                            </environmentVariables>
                            <executable>spark-submit</executable>
                            <arguments>
                                <argument>--master</argument>
                                <argument>yarn</argument>
                                <argument>${project.build.directory}/${project.artifactId}-${project.version}-uber.jar</argument>
                            </arguments>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <!-- Use the shade plugin to remove all the provided artifacts (such as spark itself) from the uber jar -->
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.1.0</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <!-- Remove signed keys to prevent security exceptions on uber jar -->
                            <!-- See https://stackoverflow.com/a/6743609/7245239 -->
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <manifestEntries>
                                        <Main-Class>net.martinprobson.spark.spark_example.SparkTest</Main-Class>
                                    </manifestEntries>
                                </transformer>
                            </transformers>
                            <artifactSet>
                                <excludes>
                                    <exclude>javax.servlet:*</exclude>
                                    <exclude>org.apache.hadoop:*</exclude>
                                    <exclude>org.apache.maven.plugins:*</exclude>
                                    <exclude>org.apache.spark:*</exclude>
                                    <exclude>org.apache.avro:*</exclude>
                                    <exclude>org.apache.parquet:*</exclude>
                                    <exclude>org.scala-lang:*</exclude>
                                </excludes>
                            </artifactSet>
                            <finalName>${project.artifactId}-${project.version}-uber</finalName>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
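
With the exec-maven-plugin executions defined above, you can also launch the packaged uber jar straight from Maven (assuming Maven 3.3.1 or newer, which can address an execution by its id):

mvn package exec:exec@run-local
# Or against YARN, with the cluster config copied into spark-remote/conf:
mvn package exec:exec@run-yarn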

After that, just run a simple read and write test. Once the code is written, package it with mvn package; the resulting jar lands in the target folder, and you submit it with spark-submit.
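
A minimal read/write test might look like the sketch below. The test keyspace and the kv / kv_copy tables match the example names created earlier, the connection host assumes a local Cassandra node, and the object name should match the Main-Class entry configured in the shade plugin:

import org.apache.spark.sql.SparkSession

object SparkTest {
  def main(args: Array[String]): Unit = {
    // Connection host assumes a local Cassandra node; change it for a remote cluster
    val spark = SparkSession.builder()
      .appName("spark-cassandra-test")
      .config("spark.cassandra.connection.host", "127.0.0.1")
      .getOrCreate()

    // Read the test.kv table created earlier
    val df = spark.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "kv"))
      .load()
    df.show()

    // Write the same rows into test.kv_copy
    df.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "kv_copy"))
      .mode("append")
      .save()

    spark.stop()
  }
}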
