
Spark "bind on port 0. Attempting port 1": Problem and Solution


The "bind on port 0. Attempting port 1" problem when running Spark in local mode on Linux:

2016-11-01 16:04:56 [org.apache.spark.util.Utils]-[WARN] - Service 'sparkDriver' could not bind on port 0. Attempting port 1.

2016-11-01 16:04:56 [org.apache.spark.SparkContext]-[ERROR] - Error initializing SparkContext.

java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.


The cause is that the machine's hostname cannot be resolved to an IP address; it is not a problem with the port binding itself.
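Spark derives the driver address from the JVM's local-hostname lookup, so the failure can be reproduced outside of Spark. The following is a minimal diagnostic sketch (not part of the original post; it assumes only the standard java.net API):

import java.net.InetAddress

// Spark's driver resolves the local hostname in essentially this way.
// If the OS hostname has no entry in /etc/hosts (or DNS), getLocalHost
// throws UnknownHostException; if it resolves to an address that is not
// assigned to any local interface, the driver's bind fails with the
// "Can't assign requested address" error shown above.
object CheckHostname {
  def main(args: Array[String]): Unit = {
    val local = InetAddress.getLocalHost
    println(s"hostname = ${local.getHostName}")
    println(s"resolved = ${local.getHostAddress}")
  }
}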


Add the following entry to /etc/hosts:


127.0.0.1 hostname


Replace hostname with the machine's actual hostname. With this entry in place, Spark in local mode can resolve the local IP and starts successfully.
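As a quick check (a sketch, not part of the original article), a local SparkContext should now start cleanly. spark.driver.host is a standard Spark setting that can also point the driver at an explicit address if editing /etc/hosts is not an option:

import org.apache.spark.{SparkConf, SparkContext}

object LocalBindCheck {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("bind-check")
      // Optional alternative to the /etc/hosts entry: give the driver a
      // concrete local address instead of relying on hostname resolution.
      .set("spark.driver.host", "127.0.0.1")

    // Before the fix, this line failed with the BindException shown above.
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).count())  // trivial job to confirm the context works
    sc.stop()
  }
}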




This article comes from the "去買大白兔" blog. Please contact the author before reprinting.
