
Spark Errors and Pitfalls Encountered

 

1. Mismatched Java versions cause a startup error.

# Solution:
# At the top of the startup script, set an environment variable pinning the Java version
export JAVA_HOME=/usr/java/jdk1.8.0_181-amd64/jre
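As a quick sanity check, a minimal sketch (reusing the JDK path from the fix above; adjust it to your own install) that prepends the pinned JDK's `bin` directory to `PATH` and confirms it wins the lookup order:

```shell
# Pin the Java installation the startup script should use
JAVA_HOME=/usr/java/jdk1.8.0_181-amd64/jre
# Prepend its bin directory so it is found before any other java on PATH
PATH="$JAVA_HOME/bin:$PATH"

# The first PATH entry should now be the pinned JDK's bin directory
echo "$PATH" | cut -d: -f1
# → /usr/java/jdk1.8.0_181-amd64/jre/bin
```

After this, `java -version` run from the script should report the pinned JDK rather than whichever Java happens to come first on the system PATH.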

 

2. Spark 1 and Spark 2 coexist on the cluster, causing a startup error.

# Point SPARK_HOME at the Spark version you want to launch
export SPARK_HOME=/data01/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2
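Spark's launcher scripts resolve `spark-submit`, `spark-shell`, and their libraries relative to `SPARK_HOME`, so setting it is what actually selects which installation runs. A small sketch (reusing the Spark 2 parcel path from the fix above; your parcel layout may differ):

```shell
# Select the Spark 2 parcel instead of the default Spark 1 install
SPARK_HOME=/data01/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2

# The launcher that will actually run is resolved under SPARK_HOME
echo "$SPARK_HOME/bin/spark-submit"
# → /data01/opt/cloudera/parcels/SPARK2-2.3.0.cloudera3-1.cdh5.13.3.p0.458809/lib/spark2/bin/spark-submit
```

On a live cluster, running `$SPARK_HOME/bin/spark-submit --version` is a quick way to confirm the intended version was picked up.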

 

3. Missing Hadoop dependency jars

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
    at java.lang.Class.getMethod0(Class.java:3018)
    at java.lang.Class.getMethod(Class.java:1784)
    at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Solution:
# Add Hadoop's classpath to SPARK_DIST_CLASSPATH
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
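`hadoop classpath` prints a colon-separated list of configuration directories and jar globs, and `SPARK_DIST_CLASSPATH` hands that list to Spark's launcher so classes like `org.slf4j.Logger` become visible. A minimal sketch using a hypothetical stand-in for the command's output (on a real node the command prints your cluster's actual paths):

```shell
# Hypothetical stand-in for what `hadoop classpath` prints on a real cluster
hadoop_cp='/opt/hadoop/etc/hadoop:/opt/hadoop/share/hadoop/common/*'

# On a real node you would instead run: export SPARK_DIST_CLASSPATH=$(hadoop classpath)
export SPARK_DIST_CLASSPATH="$hadoop_cp"

# Each colon-separated entry becomes one element of the JVM classpath
echo "$SPARK_DIST_CLASSPATH" | tr ':' '\n'
```

Note the single quotes around the stand-in value: they keep the shell from expanding the `*` glob prematurely, so the JVM receives the wildcard and expands it against the jar directory itself.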