51cto Zhao Qiang HADOOP Study (15): Sqoop Import and Export

Author: lehuai | Published 2017-12-19 21:04

    Using Sqoop to import data from a relational database

    Installation

    # cd training
    # tar -zxvf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz
    # rm -rf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz
    # mv sqoop-1.4.6.bin__hadoop-2.0.4-alpha/ sqoop
    # cd sqoop
    # vi /etc/profile
    export SQOOP_HOME=/root/training/sqoop
    export PATH=$PATH:$SQOOP_HOME/bin
    # source /etc/profile
    # cd conf
    # cp sqoop-env-template.sh sqoop-env.sh
    # vi sqoop-env.sh
    export HADOOP_COMMON_HOME=/usr/local/hadoop
    export HADOOP_MAPRED_HOME=/usr/local/hadoop

    # cd ../..
    # tar -zxvf mysql-connector-java-5.1.44.tar.gz
    # cd mysql-connector-java-5.1.44 && ls
    # cp mysql-connector-java-5.1.44-bin.jar /root/training/sqoop/lib
    # cd ../..
    # sqoop-version
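Before running any import, it is worth checking that the JDBC driver actually landed in Sqoop's `lib` directory, since a missing driver is the most common cause of a failed `--connect`. A minimal sketch, assuming the install paths used above (adjust `SQOOP_HOME` if your layout differs):

```shell
# Quick sanity check of the layout assumed above; the paths are the ones
# used in these notes, so adjust if your install directory differs.
SQOOP_HOME=${SQOOP_HOME:-/root/training/sqoop}

if [ -f "$SQOOP_HOME/lib/mysql-connector-java-5.1.44-bin.jar" ]; then
    msg="JDBC driver present in $SQOOP_HOME/lib"
else
    msg="JDBC driver missing: copy mysql-connector-java-*-bin.jar into $SQOOP_HOME/lib"
fi
echo "$msg"
```

If the driver is missing, repeat the `cp` step above before trying `sqoop-version` or any import.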
    

    Import and Export

    Use Sqoop to import a table into HDFS (Oracle example first, then the MySQL equivalent)
    ./sqoop import --connect jdbc:oracle:thin:@192.168.56.101:1521:orcl --username scott --password tiger --table emp --columns 'empno,ename,job,sal,deptno' -m 1 --target-dir '/sqoop/emp'
    # sqoop import --connect jdbc:mysql://192.168.116.27:3306/test --username root --password root --table emp --columns 'empno,ename,job,sal,deptno' -m 1 --target-dir '/sqoop/emp'
    # hdfs dfs -ls -R /sqoop
    # hdfs dfs -cat /sqoop/emp/part-m-00000
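With `-m 1` the import produces a single text file, `part-m-00000`, whose fields are comma-delimited by default. A local sketch of what that file looks like and how to slice it, using illustrative rows in the style of the classic scott/emp sample data (the values below are assumptions, not output from the commands above):

```shell
# Simulate a Sqoop text import result: comma-delimited, one row per line.
# Rows are illustrative scott/emp-style sample values.
cat > part-m-00000 <<'EOF'
7369,SMITH,CLERK,800,20
7499,ALLEN,SALESMAN,1600,30
7839,KING,PRESIDENT,5000,10
EOF

# e.g. list the names of employees in department 10 (field 5 is deptno):
awk -F',' '$5 == 10 {print $2}' part-m-00000
```

The same `awk` filter works on the real file after pulling it down with `hdfs dfs -get`.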
    
    Use Sqoop to import data into Hive
    ./sqoop import --hive-import --connect jdbc:oracle:thin:@192.168.56.101:1521:orcl --username scott --password tiger --table emp -m 1 --columns 'empno,ename,job,sal,deptno'
    
    sqoop import --hive-import --connect jdbc:mysql://192.168.116.27:3306/test --username root --password root --table emp -m 1 --columns 'empno,ename,job,sal,deptno'
    
    Use Sqoop to import data into Hive, specifying the target Hive table name
    ./sqoop import --hive-import --connect jdbc:oracle:thin:@192.168.56.101:1521:orcl --username scott --password tiger --table emp -m 1 --columns 'EMPNO,ENAME,JOB,SAL,DEPTNO' --hive-table emp1
    
    sqoop import --hive-import --connect jdbc:mysql://192.168.116.27:3306/test --username root --password root --table emp -m 1 --columns 'empno,ename,job,sal,deptno' --hive-table emp1
    
    Use Sqoop to import data into Hive with a WHERE condition
    ./sqoop import --hive-import --connect jdbc:oracle:thin:@192.168.56.101:1521:orcl --username scott --password tiger --table emp -m 1 --columns 'EMPNO,ENAME,JOB,SAL,DEPTNO' --hive-table emp2 --where 'DEPTNO=10'
    
    sqoop import --hive-import --connect jdbc:mysql://192.168.116.27:3306/test --username root --password root --table emp -m 1 --columns 'empno,ename,job,sal,deptno' --hive-table emp2 --where 'deptno=10'
    
    
    Use Sqoop to import data into Hive using a free-form query
    ./sqoop import --hive-import --connect jdbc:oracle:thin:@192.168.56.101:1521:orcl --username scott --password tiger -m 1 --query 'SELECT * FROM EMP WHERE SAL<2000 AND $CONDITIONS' --target-dir '/sqoop/emp5' --hive-table emp5
    
    sqoop import --hive-import --connect jdbc:mysql://192.168.116.27:3306/test --username root --password root -m 1 --query 'SELECT * FROM EMP WHERE SAL<2000 AND $CONDITIONS' --target-dir '/sqoop/emp5' --hive-table emp5 
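The `$CONDITIONS` token in a `--query` import is substituted by Sqoop at runtime (it becomes the split predicate; with `-m` greater than 1 you would also need `--split-by`). That only works if the shell passes it through literally, which is why the query is single-quoted above. A small quoting sketch:

```shell
# Single quotes keep $CONDITIONS literal for Sqoop to substitute:
Q1='SELECT * FROM EMP WHERE SAL<2000 AND $CONDITIONS'
echo "$Q1"

# Inside double quotes the dollar sign must be escaped; otherwise the
# shell expands $CONDITIONS to an empty string and Sqoop rejects the query:
Q2="SELECT * FROM EMP WHERE SAL<2000 AND \$CONDITIONS"
echo "$Q2"
```

Both forms hand Sqoop the identical query string; forgetting either the single quotes or the backslash is a common source of "must contain '$CONDITIONS'" errors.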
    
    
    Use Sqoop to export Hive data to a relational database (the target table must already exist; note that --hive-import is an import-only option and does not apply to export)
    ./sqoop export --connect jdbc:oracle:thin:@192.168.56.101:1521:orcl --username scott --password tiger -m 1 --table MYEMP --export-dir ********
    sqoop export --connect jdbc:mysql://192.168.116.27:3306/test --username root --password root -m 1 --table ddd --export-dir /user/hive/warehouse/emp --input-fields-terminated-by '\001'
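The `--input-fields-terminated-by '\001'` flag is needed because Hive's default text storage separates fields with the non-printing byte `\001` (Ctrl-A), not commas. A quick local demonstration of what those warehouse files contain:

```shell
# Write one Hive-style row: fields separated by the \001 (Ctrl-A) byte.
# The values are illustrative, in the style of the emp table used above.
printf '7369\001SMITH\001CLERK\n' > hive_row.txt

# The delimiter is invisible in a plain cat; translate it to commas
# to make the field boundaries visible:
tr '\001' ',' < hive_row.txt
```

If an export silently loads every row into a single column, a mismatched field delimiter is the first thing to check.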
    


Source: https://www.haomeiwen.com/subject/hggiwxtx.html