
Kafka exception: Received -1 when reading from channel, socket has likely been closed

Original content takes effort to create; please do not plagiarize, and credit the source when reposting. If you have questions, add me on WeChat: wx15151889890. Thanks.
Link to this post: https://blog.csdn.net/wx740851326/article/details/84032929
I had originally hard-coded the Kafka connection info directly in the code. Later I changed it so that the Kafka info was read from a configuration file, and the job started failing. At first I was completely baffled...
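The configuration loading looked roughly like the sketch below. This is my own reconstruction for illustration only; the kafka.properties file name, the helper method, and the property keys are assumptions, not taken from the original code.

import java.io.InputStream;
import java.util.Properties;

// Minimal sketch: load the Kafka settings from a classpath resource.
// The file name and key names are assumptions for illustration only.
static Properties loadKafkaProps() throws Exception {
    Properties props = new Properties();
    try (InputStream in = Thread.currentThread().getContextClassLoader()
            .getResourceAsStream("kafka.properties")) {
        if (in == null) {
            // Fail loudly if the file is missing instead of returning an empty config.
            throw new IllegalStateException("kafka.properties not found on classpath");
        }
        props.load(in);
    }
    return props;
}

With that change in place, the job threw the following exception on startup: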

18/11/12 09:08:22 ERROR yarn.ApplicationMaster: User class threw exception: org.apache.spark.SparkException: java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.
java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.
org.apache.spark.SparkException: java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.
java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.
	at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
	at org.apache.spark.streaming.kafka.KafkaCluster$$anonfun$checkErrors$1.apply(KafkaCluster.scala:366)
	at scala.util.Either.fold(Either.scala:98)
	at org.apache.spark.streaming.kafka.KafkaCluster$.checkErrors(KafkaCluster.scala:365)
	at org.apache.spark.streaming.kafka.KafkaUtils$.getFromOffsets(KafkaUtils.scala:222)
	at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:484)
	at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:607)
	at org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream(KafkaUtils.scala)
	at com.cvc.wj.handle.RealTimeAlarm.main(RealTimeAlarm.java:53)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:688)

Looking at the error message, the problem occurred while creating the stream that reads from Kafka. The stream-creation code is as follows:

import kafka.serializer.StringDecoder;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.kafka.KafkaUtils;

JavaPairInputDStream<String, String> lines = KafkaUtils
		.createDirectStream(jssc, String.class, // key type
				String.class,                   // value type
				StringDecoder.class,            // key decoder
				StringDecoder.class,            // value decoder
				kafkaParams, topics);
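
The kafkaParams map and the topics set passed in above come from the loaded configuration. A sketch of how they might be built, using the helper and the assumed property keys from earlier (the spark-streaming-kafka 0.8 direct API expects a Map<String, String> and a Set<String>):

import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

Properties props = loadKafkaProps();

// The 0.8 direct API reads the broker list from the "metadata.broker.list" key.
Map<String, String> kafkaParams = new HashMap<>();
kafkaParams.put("metadata.broker.list", props.getProperty("metadata.broker.list"));

// Topics are assumed to be stored as a comma-separated list in the config file.
Set<String> topics = new HashSet<>(
        Arrays.asList(props.getProperty("topics").split(",")));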

Spark is involved here, but at this point it is only being initialized and nothing else, so it can be ruled out. That leaves the Kafka configuration parameters and the topic information.
I then printed out the topics and found that the Kafka info had not actually been read from the configuration. After fixing the topic entry in the configuration file, the problem was solved.
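A small defensive check right after loading the properties, before building kafkaParams and topics, would have surfaced the mistake immediately instead of leaving only the obscure EOFException. Something along these lines (my own addition, not from the original code):

// Validate the raw property values up front so that a missing or misspelled key
// in the config file fails with a clear message rather than an EOFException
// deep inside createDirectStream.
String brokers = props.getProperty("metadata.broker.list");
String topicList = props.getProperty("topics");
if (brokers == null || brokers.trim().isEmpty()
        || topicList == null || topicList.trim().isEmpty()) {
    throw new IllegalArgumentException(
            "Kafka config not loaded: metadata.broker.list=" + brokers
            + ", topics=" + topicList);
}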
To sum up: when something goes from working to suddenly throwing errors like this, the cause is very likely the configuration. Check the Kafka configuration entries carefully; the problem is unlikely to be in Kafka itself.