
A Rookie Mistake While Running Spark Shell on a Single Machine

bin/spark-shell

Download spark-2.1.0-bin-hadoop2.7.tgz, unpack it, cd into the Spark root directory, and running bin/spark-shell gets you straight into the shell.
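For completeness, here is a minimal command sequence to reproduce this setup; the download URL is an assumption (the Apache archive mirror), so adjust it to whatever mirror you use:

# Download Spark 2.1.0 pre-built for Hadoop 2.7 (URL assumes the Apache archive)
wget https://archive.apache.org/dist/spark/spark-2.1.0/spark-2.1.0-bin-hadoop2.7.tgz
tar -xzf spark-2.1.0-bin-hadoop2.7.tgz
cd spark-2.1.0-bin-hadoop2.7
bin/spark-shell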
But today it greeted me with a rookie mistake:
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.

[root@sk1 spark-2.1.0-bin-hadoop2.7]# bin/spark-shell 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/04/07 22:33:37 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/07 22:33:38 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/04/07 22:33:38 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
(... the same warning is printed 16 times in total ...)
17/04/07 22:33:38 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
        at java.lang.Thread.run(Thread.java:745)
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
(... identical stack trace repeated ...)
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
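The exception message itself points at configuration knobs, and if you just need the shell up, you can force the driver onto a known-good address first. A minimal sketch of two commonly used workarounds (SPARK_LOCAL_IP and spark.driver.bindAddress are documented Spark settings, the latter introduced in Spark 2.1; note both only mask the real problem, which is diagnosed below):

# Workaround 1: point Spark at the loopback address via the environment
export SPARK_LOCAL_IP=127.0.0.1
bin/spark-shell

# Workaround 2 (Spark 2.1+): set the driver bind address on the command line
bin/spark-shell --conf spark.driver.bindAddress=127.0.0.1

Either gets the shell running, but the root cause here is a hostname-to-IP mismatch, as the next section shows.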

Root cause

[root@sk1 ~]# ifconfig
ens32: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
        inet 192.168.11.138  netmask 255.255.255.0  broadcast 192.168.11.255
        inet6 fe80::a8bd:a097:8ca9:d22a  prefixlen 64  scopeid 0x20<link>
        ether 00:0c:29:c3:3e:9a  txqueuelen 1000  (Ethernet)
        RX packets 273939  bytes 395373188 (377.0 MiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 16657  bytes 2472671 (2.3 MiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

lo: flags=73<UP,LOOPBACK,RUNNING>  mtu 65536
        inet 127.0.0.1  netmask 255.0.0.0
        inet6 ::1  prefixlen 128  scopeid 0x10<host>
        loop  txqueuelen 1  (Local Loopback)
        RX packets 276  bytes 23980 (23.4 KiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 276  bytes 23980 (23.4 KiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

The local IP is 192.168.11.138.

[root@sk1 ~]# cat /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.1.138   sk1

Clearly the IP is misconfigured: /etc/hosts maps the hostname sk1 to 192.168.1.138, an address that is not assigned to any interface, so the driver's bind fails with "Cannot assign requested address". Correcting the entry fixes it:

[root@sk1 ~]# vi /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4
::1         localhost localhost.localdomain localhost6 localhost6.localdomain6
192.168.11.138  sk1
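Before relaunching, a quick sanity check confirms the hostname now resolves to the interface address; getent queries the same resolver that Spark will use:

# should print: 192.168.11.138  sk1
getent hosts sk1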

Relaunch the shell:

[root@sk1 spark-2.1.0-bin-hadoop2.7]# bin/spark-shell 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/04/07 22:41:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/07 22:41:32 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/04/07 22:41:33 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/04/07 22:41:34 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.11.138:4040
Spark context available as 'sc' (master = local[*], app id = local-1491619281633).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_112)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 
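With the shell up and sc available, a one-liner confirms the driver really works end to end; the expected output is sketched below as the REPL would print it (the sum of 1 to 100 is 5050, returned as a Double by RDD.sum):

scala> sc.parallelize(1 to 100).sum
res0: Double = 5050.0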
