
Spark cluster jobs fail with "Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources"

1 When running a job, the Spark cluster kept printing the following warning and the job never made progress:

    WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

2 Cause: by default each executor requests 1024 MB of memory, which is more than the workers in this cluster can offer, so the scheduler can never place the job on any worker and keeps waiting for resources.
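If changing the cluster configuration is not an option, the memory request can also be lowered on the application side when submitting the job. A minimal sketch, assuming a standalone master at spark://master-host:7077; the master URL, class name, and jar are placeholders:

    # Ask for less executor memory than the 1024 MB default so the
    # scheduler can fit the job onto a small worker.
    spark-submit \
      --master spark://master-host:7077 \
      --executor-memory 512m \
      --class com.example.MyApp \
      my-app.jar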

3 Solution:

     Add export SPARK_WORKER_MEMORY=512m and export SPARK_DAEMON_MEMORY=256m to conf/spark-env.sh (note that Spark expects a unit suffix such as m or g on these values), as shown below.
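A minimal sketch of the corresponding conf/spark-env.sh entries; the exact values should be chosen to fit the physical memory actually available on each worker node:

    # conf/spark-env.sh
    # Total memory a worker may hand out to executors on this node.
    export SPARK_WORKER_MEMORY=512m
    # Memory for the Spark daemon processes themselves (master/worker).
    export SPARK_DAEMON_MEMORY=256m

The file is only read when the daemons start, so restart the cluster afterwards (e.g. sbin/stop-all.sh followed by sbin/start-all.sh) for the new limits to take effect.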