Spark Installation Tutorial

Contents
1. Check the JDK version
2. Obtain the Spark installation package
3. Environment variables
4. Configuration files
5. Restart the Hadoop cluster (to apply the configuration)
6. Start the Spark cluster
  6.1 Check the Spark services
  6.2 Access the Spark web UI
7. Start spark-shell to test the Scala interactive environment
8. Test Spark on YARN
9. Stop the Spark cluster

1. Check the JDK version
Check that the JDK is installed and that its version is 1.8.
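The check can also be scripted. A sketch that gates on the major version — the version string is hard-coded here to the output shown in this tutorial; in practice capture it with `ver=$(javac -version 2>&1)` (JDK 8's `javac -version` prints to stderr):

```shell
# Sketch: scripted JDK version gate, parsing a "javac 1.8.0_171"-style string.
# Hard-coded to the output shown below; in practice: ver=$(javac -version 2>&1)
ver="javac 1.8.0_171"
# Extract the major.minor prefix, e.g. "1.8"
mm=$(echo "$ver" | sed -E 's/javac ([0-9]+\.[0-9]+).*/\1/')
if [ "$mm" = "1.8" ]; then
  echo "JDK version OK"
else
  echo "expected JDK 1.8, found $mm" >&2
fi
```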
javac -version
# javac 1.8.0_171

2. Obtain the Spark installation package
This tutorial uses Spark 3.1.2 as the example; the installation package is linked at the top of the article. Official downloads: https://dlcdn.apache.org/
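For reference, the Apache mirrors lay releases out in a predictable path; a small sketch that composes the download URL. The `hadoop3.2` build profile is an assumption — pick the one matching your Hadoop version — and note that older releases are eventually moved off dlcdn to archive.apache.org/dist/spark/:

```shell
# Sketch: compose the download URL for a given Spark release.
# Version matches this tutorial; the hadoop3.2 profile is an assumption.
SPARK_VERSION=3.1.2
HADOOP_PROFILE=hadoop3.2
URL="https://dlcdn.apache.org/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-${HADOOP_PROFILE}.tgz"
echo "$URL"
```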
3. Environment variables
vim /etc/profile
export SPARK_HOME=/opt/software/spark-3.1.2
export PATH=$SPARK_HOME/bin:$PATH

Run `source /etc/profile` afterwards so the new variables take effect in the current shell.

4. Configuration files
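Before editing the configuration files, it is worth confirming that the step-3 variables took effect. A sketch (the exports are repeated from /etc/profile so it runs standalone; normally you would just `source /etc/profile` first):

```shell
# Sketch: confirm SPARK_HOME is set and its bin directory is on PATH.
# Exports repeated from /etc/profile so this runs standalone.
export SPARK_HOME=/opt/software/spark-3.1.2
export PATH=$SPARK_HOME/bin:$PATH
case ":$PATH:" in
  *":$SPARK_HOME/bin:"*) echo "SPARK_HOME is on PATH" ;;
  *) echo "PATH is missing $SPARK_HOME/bin" >&2 ;;
esac
```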
cd $SPARK_HOME/conf
mv spark-env.sh.template spark-env.sh
vim spark-env.sh
------------------------------------------------
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop/
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop/
------------------------------------------------

cd $HADOOP_HOME/etc/hadoop
vim yarn-site.xml
------------------------------------------------
<!-- Add these two properties. They disable YARN's physical/virtual memory
     checks so Spark containers are not killed on memory-tight nodes. -->
<property>
  <name>yarn.nodemanager.pmem-check-enabled</name>
  <value>false</value>
</property>
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>
------------------------------------------------

5. Restart the Hadoop cluster (to apply the configuration)
stop-all.sh
start-all.sh

6. Start the Spark cluster
/opt/software/spark-3.1.2/sbin/start-all.sh

6.1 Check the Spark services
jps -ml
----------------------------------------------------------------
1649 org.apache.spark.deploy.master.Master --host single --port 7077 --webui-port 8080
1707 org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://single:7077
----------------------------------------------------------------

6.2 Access the Spark web UI
http://single01:8080/
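The step-6.1 check can also be scripted; a sketch that greps the `jps -ml` output for both daemons (the output is hard-coded here to the listing above — in practice use `jps_out=$(jps -ml)`):

```shell
# Sketch: verify the Spark Master and Worker daemons appear in jps output.
# Hard-coded to the listing above; in practice: jps_out=$(jps -ml)
jps_out="1649 org.apache.spark.deploy.master.Master --host single --port 7077 --webui-port 8080
1707 org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://single:7077"
for proc in org.apache.spark.deploy.master.Master org.apache.spark.deploy.worker.Worker; do
  if echo "$jps_out" | grep -q "$proc"; then
    echo "${proc##*.} is up"     # prints e.g. "Master is up"
  else
    echo "${proc##*.} is MISSING" >&2
  fi
done
```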
7. Start spark-shell to test the Scala interactive environment
spark-shell --master spark://single:7077
----------------------------------------------------------------
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://single:4040
Spark context available as sc (master spark://single:7077, app id app-20240315091621-0000).
Spark session available as spark.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.1.2
      /_/

Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_171)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
----------------------------------------------------------------

8. Test Spark on YARN
spark-shell --master yarn
----------------------------------------------------------------
Spark context Web UI available at http://single:4040
Spark context available as sc (master yarn, app id application_1710465965758_0001).
Spark session available as spark.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.1.2
      /_/

Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_171)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
----------------------------------------------------------------

9. Stop the Spark cluster
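Before shutting down, a common end-to-end smoke test is submitting the bundled SparkPi example. A sketch that assembles the command — the examples jar path and name are assumptions based on the stock Spark 3.1.2 / Scala 2.12 distribution layout, so adjust them to your install:

```shell
# Sketch: assemble (and print) a SparkPi smoke-test submission.
# Jar name assumes the stock Spark 3.1.2 / Scala 2.12 distribution layout.
SPARK_HOME=/opt/software/spark-3.1.2
CMD="spark-submit --master spark://single:7077 \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_2.12-3.1.2.jar 100"
echo "$CMD"
```

Run the printed command against the live cluster; the driver output should report an estimate of Pi. To route the same test through YARN (as in step 8), swap the master URL for `--master yarn`.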
/opt/software/spark-3.1.2/sbin/stop-all.sh