Spark Worker startup error: No subfolder can be created in
阿新 • Posted: 2018-12-31
Solution up front: spark-env.sh has a parameter, SPARK_LOCAL_DIRS, which points to the directory where shuffle data is spilled to disk. This error occurs because that directory does not exist. Create the directory and restart the worker, then rebalance the cores and memory across workers.
The full error, for reference:
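The fix above can be sketched as follows. The path /data/spark/tmp and the spark user/group are assumptions for illustration; substitute whatever SPARK_LOCAL_DIRS should point to in your deployment.

```shell
# In conf/spark-env.sh on each worker (example path, not from the original post):
export SPARK_LOCAL_DIRS=/data/spark/tmp

# Create the directory before restarting the worker, and make sure the
# user running the Spark daemons can write to it:
mkdir -p /data/spark/tmp
chown spark:spark /data/spark/tmp   # adjust user/group to your deployment

# Restart the worker so it picks up the setting (standalone-mode scripts):
$SPARK_HOME/sbin/stop-slave.sh
$SPARK_HOME/sbin/start-slave.sh spark://<master-host>:7077
```

If SPARK_LOCAL_DIRS lists several comma-separated directories, every one of them must exist and be writable, since the worker tries to create an executor subfolder in each.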
18/03/29 09:59:01 ERROR Worker: Failed to launch executor app-20180329063203-1549/1642 for Mobius.di.2::bdp-141. [dispatcher-event-loop-21]
java.io.IOException: No subfolder can be created in .
at org.apache.spark.deploy.worker.Worker$$anonfun$receive$1$$anonfun$9.apply(Worker.scala:499)
at org.apache.spark.deploy.worker.Worker$$anonfun$receive$1$$anonfun$9.apply(Worker.scala:484)
at scala.collection.MapLike$class.getOrElse(MapLike.scala:128)
at scala.collection.AbstractMap.getOrElse(Map.scala:59)
at org.apache.spark.deploy.worker.Worker$$anonfun$receive$1.applyOrElse(Worker.scala:484)
at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:117)
at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)