
Fixing spark-shell startup errors

Problem: Scala version incompatibility.
This happens when project dependency jars are copied into /usr/cwgis/app/spark/jars/lib/, where they conflict with the Scala runtime that Spark ships with.
Deleting the jar files whose names start with scala lets spark-shell start again:

# runCmd.sh "rm /usr/cwgis/app/spark/jars/lib/scala*.jar" all

Error: Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher

SparkException: Yarn application has already ended! It might have been killed or unable to launch application master

These two errors typically mean the YARN containers could not locate the Spark runtime jars. For Spark 1.0, point SPARK_JAR at the assembly jar:

$ export SPARK_JAR=lib/spark-assembly-1.0.1-hadoop2.2.0.jar

For other Spark versions:

Create a combined jar archive instead.
Generate a spark-libs.jar file that packages every jar under /spark/jars, including jars in subdirectories:

jar cv0f spark-libs.jar -C /usr/cwgis/app/spark/jars/ .

In spark-defaults.conf, set spark.yarn.archive=hdfs://mycluster:8020/spark/spark-libs.jar
or spark.yarn.jars=hdfs://mycluster:8020/spark/spark-libs.jar
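The archive-based fix above can be collected into one sequence. This is a sketch, not a definitive procedure: the jars directory, the conf path, and the HDFS nameservice `mycluster:8020` are taken from the examples in this note, and the upload target `/spark/` is assumed to be writable; adjust all of them for your cluster. It assumes a JDK (for the `jar` tool) and a working HDFS client.

```shell
# Build a single uncompressed archive of everything under Spark's jars
# directory (jar flags: c = create, v = verbose, 0 = no compression,
# f = output file; -C changes into the directory before adding files).
jar cv0f spark-libs.jar -C /usr/cwgis/app/spark/jars/ .

# Upload the archive to HDFS so YARN containers can fetch it
# instead of failing to find the Spark classes at launch.
hdfs dfs -mkdir -p /spark
hdfs dfs -put -f spark-libs.jar /spark/

# Point Spark at the archive; conf path is an assumption for this install.
echo "spark.yarn.archive hdfs://mycluster:8020/spark/spark-libs.jar" \
  >> /usr/cwgis/app/spark/conf/spark-defaults.conf
```

After this, newly submitted YARN applications localize spark-libs.jar from HDFS, so the ExecutorLauncher class resolves without shipping jars on every submit.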