
Installing Spark on CentOS 7

2021/4/29 14:54:29



  1. Prepare the package: spark-2.3.0-bin-hadoop2.7.tgz

Prerequisite: JDK and Hadoop are already installed and their environment variables are configured.

  2. Extract and rename

tar -zxvf spark-2.3.0-bin-hadoop2.7.tgz -C /home/hadoop/
mv spark-2.3.0-bin-hadoop2.7/ spark

tar -zxvf scala-2.12.8.tgz -C /home/hadoop
mv scala-2.12.8/ scala
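The two steps above follow the same extract-and-rename pattern. The sketch below rehearses that pattern against a throwaway tarball, with mktemp directories standing in for /home/hadoop, so it is safe to run anywhere:

```shell
# Rehearse the tar -zx / mv pattern from the steps above against a
# throwaway tarball; temp dirs stand in for /home/hadoop.
set -e
work=$(mktemp -d)
mkdir -p "$work/spark-2.3.0-bin-hadoop2.7"
tar -czf "$work/pkg.tgz" -C "$work" spark-2.3.0-bin-hadoop2.7
dest=$(mktemp -d)
tar -zxf "$work/pkg.tgz" -C "$dest"
mv "$dest/spark-2.3.0-bin-hadoop2.7" "$dest/spark"
ls "$dest"    # prints: spark
```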

  3. Configure environment variables

export SQOOP_HOME=/home/hadoop/sqoop
export SPARK_HOME=/home/hadoop/spark
export SCALA_HOME=/home/hadoop/scala
export PATH=$PATH:$PIG_HOME/bin:$ZK_HOME/bin:$HBASE_HOME/bin:$HIVE_HOME/bin:$SQOOP_HOME/bin:$SPARK_HOME/bin:$SPARK_HOME/sbin:$SCALA_HOME/bin
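After appending these exports to /etc/profile (or ~/.bashrc), reload the file with `source` and confirm the variables resolve. A minimal check, with the exports repeated so the snippet is self-contained (paths follow this article's layout; adjust to yours):

```shell
# Sanity check after `source /etc/profile`; exports repeated here so the
# snippet stands alone (paths are the ones used in this article).
export SPARK_HOME=/home/hadoop/spark
export SCALA_HOME=/home/hadoop/scala
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin:$SCALA_HOME/bin
echo "$SPARK_HOME"    # prints: /home/hadoop/spark
case ":$PATH:" in
  *":$SPARK_HOME/bin:"*) echo "PATH ok" ;;
  *) echo "PATH missing Spark bin" ;;
esac
```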

  4. Edit the configuration files
    cp log4j.properties.template log4j.properties
    vi log4j.properties

cp spark-env.sh.template spark-env.sh
vi spark-env.sh
export SPARK_DIST_CLASSPATH=$(/home/hadoop/hadoop-2.7.7/bin/hadoop classpath)
export SCALA_HOME=/home/hadoop/scala
export JAVA_HOME=/home/hadoop/jdk1.8.0_40
export HADOOP_HOME=/home/hadoop/hadoop-2.7.7
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_MASTER_IP=192.168.56.104
export SPARK_LOCAL_DIRS=/home/hadoop/spark
export SPARK_WORKER_MEMORY=512m
export HIVE_CONF_DIR=/home/hadoop/hive/conf
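The SPARK_DIST_CLASSPATH line relies on command substitution: `$( ... )` runs `hadoop classpath` when spark-env.sh is sourced and captures its output, which is how Spark locates Hadoop's jars. A sketch of just that mechanism, with `hadoop` stubbed by a shell function (the path string it prints is illustrative):

```shell
# Demonstrate the $(...) command substitution behind SPARK_DIST_CLASSPATH.
# A real install calls the hadoop binary; here it is stubbed with a
# function whose output string is made up for illustration.
hadoop() { echo "/home/hadoop/hadoop-2.7.7/etc/hadoop:/fake/jars/*"; }
SPARK_DIST_CLASSPATH=$(hadoop classpath)
echo "$SPARK_DIST_CLASSPATH"
```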

  5. Integrate Spark with Hive

cp /home/hadoop/hive/lib/mysql-connector-java-5.1.40.jar /home/hadoop/spark/jars/
cp hive/conf/hive-site.xml spark/conf/
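The integration amounts to two copies: the MySQL JDBC driver into Spark's jars/ directory and hive-site.xml into Spark's conf/ directory. The sketch below rehearses both copies in a temporary tree so it can run without a real install; the temp root stands in for /home/hadoop and the file names match the versions used in this article:

```shell
# Rehearse the two Hive-integration copies in a temp tree; $root stands
# in for /home/hadoop, file names match this article's versions.
set -e
root=$(mktemp -d)
mkdir -p "$root/hive/lib" "$root/hive/conf" "$root/spark/jars" "$root/spark/conf"
touch "$root/hive/lib/mysql-connector-java-5.1.40.jar" "$root/hive/conf/hive-site.xml"
cp "$root/hive/lib/mysql-connector-java-5.1.40.jar" "$root/spark/jars/"
cp "$root/hive/conf/hive-site.xml" "$root/spark/conf/"
ls "$root/spark/jars"    # prints: mysql-connector-java-5.1.40.jar
```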
