Big Data: Spark startup fails with "JAVA_HOME is not set" on slave1

Problem:

[root@master ~]# start-slaves.sh
slave1: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-1.6.3-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave2: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-1.6.3-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-slave2.out
slave2: failed to launch org.apache.spark.deploy.worker.Worker:
slave1: failed to launch org.apache.spark.deploy.worker.Worker:
slave1:   JAVA_HOME is not set
slave1: full log in /usr/local/spark-1.6.3-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave2:   JAVA_HOME is not set
slave2: full log in /usr/local/spark-1.6.3-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-slave2.out
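
Why this happens: start-slaves.sh launches each Worker over a non-interactive ssh session, and such sessions do not source /etc/profile or ~/.bash_profile, so a JAVA_HOME exported only there never reaches the remote launch. You can confirm this from the master with a quick check (assuming passwordless ssh to the slaves is already configured, as start-slaves.sh itself requires):

## Runs in the same kind of non-interactive shell that start-slaves.sh uses;
## empty output means the Worker launch will not see JAVA_HOME either
ssh slave1 'echo JAVA_HOME=$JAVA_HOME'
ssh slave2 'echo JAVA_HOME=$JAVA_HOME'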

Solution:

Go to the Spark root directory on each affected slave and add the JDK path to spark-config.sh under the sbin directory:

## My Spark root directory is spark-1.6.3-bin-hadoop2.6
cd /usr/local/spark-1.6.3-bin-hadoop2.6/sbin
vi spark-config.sh

## Add the JAVA_HOME path (adjust to your own JDK location)
export JAVA_HOME=/usr/local/jdk1.8
export PATH=$PATH:$JAVA_HOME/bin
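
After adding the export on every slave, restart the workers from the master and check that a Worker JVM is now running on each node (jps ships with the JDK):

start-slaves.sh
ssh slave1 jps    ## output should include a Worker process
ssh slave2 jps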
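
For reference, spark-config.sh is an internal helper sourced by the sbin scripts; the documented place for per-node environment settings is conf/spark-env.sh, created from the bundled template. A minimal sketch, using the same install paths as above:

## The documented alternative: set JAVA_HOME in conf/spark-env.sh on each slave
cd /usr/local/spark-1.6.3-bin-hadoop2.6/conf
cp spark-env.sh.template spark-env.sh        ## only if spark-env.sh does not exist yet
echo 'export JAVA_HOME=/usr/local/jdk1.8' >> spark-env.sh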