
Spark-shell error: Failed to get database default, returning NoSuchObjectException

Running a SQL query in spark-shell produces these errors:

```
20/08/24 15:33:59 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
20/08/24 15:34:03 ERROR metastore.RetryingHMSHandler: AlreadyExistsException(message:Database default already exists)
```

None of the solutions I found on Baidu helped; after getting over the firewall I found the fix. The cause is that Spark has not been given Hive's configuration file. Copy Hive's `hive-site.xml` into Spark's `conf` directory:

```
cp hive/conf/hive-site.xml spark/conf/hive-site.xml
```

That resolves the first error.

A second problem:

```
Caused by: java.lang.reflect.InvocationTargetException: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BONECP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
```

This message is self-explanatory: the JDBC driver cannot be loaded because the `mysql-connector` jar is missing from the classpath. That jar sits in Hive's `lib` directory, while Spark loads its jars from the `jars` directory, so run:

```
cp hive/lib/mysql-connector-java-5.1.44-bin.jar spark/jars/mysql-connector-java-5.1.44-bin.jar
```

With that done, restart spark-shell and queries against Hive tables work normally.
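The two copy steps above can be sketched end to end in a scratch directory. The `demo/` paths below are illustrative stand-ins for your real Hive and Spark install directories; note the use of `cp` rather than `mv`, so Hive keeps its own copies of the config file and the driver jar:

```shell
# Scratch simulation of the two fixes (demo/ paths are illustrative,
# not the real install locations)
mkdir -p demo/hive/conf demo/hive/lib demo/spark/conf demo/spark/jars
echo '<configuration/>' > demo/hive/conf/hive-site.xml
touch demo/hive/lib/mysql-connector-java-5.1.44-bin.jar

# 1) make Hive's metastore configuration visible to Spark
cp demo/hive/conf/hive-site.xml demo/spark/conf/hive-site.xml

# 2) put the MySQL JDBC driver jar on Spark's classpath
cp demo/hive/lib/mysql-connector-java-5.1.44-bin.jar demo/spark/jars/

# both files should now be in place under demo/spark/
ls demo/spark/conf demo/spark/jars
```

On a real cluster, substitute your actual Hive and Spark home directories for `demo/hive` and `demo/spark`, then restart spark-shell.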