Building a SpringCloud Base Framework – 8: Integrating Spark Applications – Environment

This is one article in a series on building a SpringCloud base framework; the series covers integrating Shiro, MySQL master-slave replication, Seata, Activiti, Drools, common Hadoop big-data components, Keepalived + Nginx HTTPS configuration, and more.

Since the earlier image mapped only port 50070 and had no directory mappings,
first adjust the running container's mapped ports and directories.
Reference: 运行容器添加映射端口或目录.note (a note on adding port/directory mappings to a running container)
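One way to add the mappings is to snapshot the running container and re-create it with the extra flags; a minimal sketch, assuming the host directory /usr/local/softs should be visible in the container as /root/bigdatas/softs (image name and ports taken from the docker ps output further down):

docker commit master bigwork:hadoop-v1   # snapshot the current container state
docker rm -f master                      # drop the old container
docker run -d --name master \
  -p 50070:50070 -p 7011:8080 \
  -v /usr/local/softs:/root/bigdatas/softs \
  bigwork:hadoop-v1 /sbin/init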
========================================
Installation references:
https://www.dandelioncloud.cn/article/details/1512971398741049346 (fairly detailed)
https://www.ngui.cc/el/1228045.html (uses scp -r to copy Spark and Scala to the slaves)
Install Scala:
wget https://downloads.lightbend.com/scala/2.11.12/scala-2.11.12.tgz
Download it this way directly on CentOS into /usr/local/softs; through the directory mapping, the archive then appears inside the master container at /root/bigdatas/softs.
(In my case I downloaded it with Chrome on a Mac and scp'd it to the VMware CentOS host.)
=====
Enter the master container:
cd /root/bigdatas/softs
tar -xzvf scala-2.11.12.tgz -C /usr/local
After extracting, configure the environment variables:
vi ~/.bashrc

export SCALA_HOME=/usr/local/scala-2.11.12
export PATH=$SCALA_HOME/bin:$PATH

After sourcing ~/.bashrc, check the version with scala -version.
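A quick sanity check (expected output sketched below; the exact copyright line may differ):

source ~/.bashrc
scala -version
# Scala code runner version 2.11.12 -- Copyright 2002-2017, LAMP/EPFL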
====
Install Spark:
cd /root/bigdatas/softs
tar -xzvf spark-2.3.0-bin-hadoop2.7.tgz -C /usr/local/
cd /usr/local && mv spark-2.3.0-bin-hadoop2.7 spark-2.3.0
vi ~/.bashrc

export SPARK_HOME=/usr/local/spark-2.3.0
export PATH=$SPARK_HOME/bin:$PATH

vi $SPARK_HOME/conf/spark-env.sh
Press Shift+G to jump to the end of the file, then append:

export JAVA_HOME=/usr/local/java/jdk1.8.0_141/
export HADOOP_HOME=/usr/local/hadoop/
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop/
export YARN_CONF_DIR=/usr/local/hadoop/etc/hadoop/
export SPARK_MASTER_IP=master
export SPARK_EXECUTOR_MEMORY=1G
export SCALA_HOME=/usr/local/scala-2.11.12/

In conf/slaves, enter two lines (if the file does not exist yet, see the note after this list):
slave1
slave2
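In the Spark 2.3.0 distribution, conf/ ships only *.template files, so if spark-env.sh or slaves does not exist yet, create both from the templates first:

cd $SPARK_HOME/conf
cp spark-env.sh.template spark-env.sh
cp slaves.template slaves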
============
Copy the Spark and Scala directories to slave1 and slave2:
cd /usr/local
scp -r spark-2.3.0 scala-2.11.12 slave1:/usr/local/
scp -r spark-2.3.0 scala-2.11.12 slave2:/usr/local/
==========
On slave1 and slave2, add the same Spark and Scala environment variables to ~/.bashrc and source it;
back on master, start the cluster: $SPARK_HOME/sbin/start-all.sh
There may be an error "failed to launch: nice -n 0 spark xxxx" (it can be ignored here);
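To confirm the daemons came up, jps is a quick check (assuming the JDK's bin directory is on PATH): the master node should list a Master process, each slave a Worker:

jps
# on master :  <pid> Master
# on slave1/2: <pid> Worker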

[root@localhost /]# docker ps -a
CONTAINER ID        IMAGE               COMMAND             CREATED             STATUS              PORTS                                              NAMES
22476a547e77        bigwork:hadoop-v1   "/sbin/init"        8 hours ago         Up About an hour    0.0.0.0:50070->50070/tcp, 0.0.0.0:7011->8080/tcp   master
d0711b3e4886        bigwork:hadoop-v1   "/sbin/init"        10 hours ago        Up About an hour    50070/tcp                                          slave2
0885beee91fc        bigwork:hadoop-v1   "/sbin/init"        10 hours ago        Up About an hour    50070/tcp                                          slave1

Open the Spark Master UI in a browser: http://172.16.59.128:7011 (host port 7011 maps to 8080 in the container)
Restart Spark; stop-all.sh prints:

slave1: stopping org.apache.spark.deploy.worker.Worker
slave2: stopping org.apache.spark.deploy.worker.Worker
stopping org.apache.spark.deploy.master.Master
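
For reference, the full restart sequence run on master (a sketch; assumes $SPARK_HOME is exported as above):

$SPARK_HOME/sbin/stop-all.sh
$SPARK_HOME/sbin/start-all.sh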

However, the Web UI on 8080 (fronting the master at 7077) could not be reached;
the log under logs/ says: Utils:66 - Service 'MasterUI' could not bind on port 8080. Attempting port 8081.
Check what is occupying 8080 (install lsof first if it is missing):
yum install -y lsof
lsof -i:8080
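If something else genuinely owns 8080, either stop that service or pin the master UI to a free port (a suggestion on my part, not from the original article): set SPARK_MASTER_WEBUI_PORT in spark-env.sh and map that port from the container instead:

export SPARK_MASTER_WEBUI_PORT=8082   # hypothetical port choice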
==============
Error: java.lang.NoClassDefFoundError: com/alibaba/druid/pool/DruidDataSourceFactory
Copy /maven/repository/com/alibaba/druid/1.1.16/druid-1.1.16.jar to spark/jars

Error: java.sql.SQLException: com.mysql.jdbc.Driver
Copy /repository/mysql/mysql-connector-java/5.1.18/mysql-connector-java-5.1.18.jar to spark/jars

Error: java.lang.NoClassDefFoundError: org/apache/commons/pool2/impl/GenericObjectPoolConfig
Copy org/apache/commons/commons-pool2/2.4.2/commons-pool2-2.4.2.jar to spark/jars

Error: java.lang.NoClassDefFoundError: redis/clients/jedis/JedisPool
Copy /redis/clients/jedis/2.9.0/jedis-2.9.0.jar to spark/jars
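
The four copies can be scripted in one go; a minimal sketch, assuming the local Maven repository is at ~/.m2/repository (adjust REPO and $SPARK_HOME to your layout):

REPO=~/.m2/repository
for j in \
  "$REPO/com/alibaba/druid/1.1.16/druid-1.1.16.jar" \
  "$REPO/mysql/mysql-connector-java/5.1.18/mysql-connector-java-5.1.18.jar" \
  "$REPO/org/apache/commons/commons-pool2/2.4.2/commons-pool2-2.4.2.jar" \
  "$REPO/redis/clients/jedis/2.9.0/jedis-2.9.0.jar"; do
  cp "$j" "$SPARK_HOME/jars/"   # make the dependency visible to Spark at runtime
done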

========================
Other:
Reference: https://blog.csdn.net/m0_48639280/article/details/128472177
(Spark installation, configuration, and logging)
