
Hive 0.13 Setup for a Spark Cluster: Complete Guide

1. Install the Hive package

1) Use WinSCP to upload apache-hive-0.13.1-bin.tar.gz (provided with the course) to the /usr/local directory on spark1.
2) Extract the Hive archive: tar -zxvf apache-hive-0.13.1-bin.tar.gz
3) Rename the Hive directory: mv apache-hive-0.13.1-bin hive
4) Configure the Hive environment variables:

vi ~/.bashrc
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin
source ~/.bashrc

2. Install MySQL

1) MySQL is installed on spark1.
2) Install the MySQL server with yum, start it, and enable it at boot:

yum install -y mysql-server
service mysqld start
chkconfig mysqld on

3) Install the MySQL connector with yum:

yum install -y mysql-connector-java

4) Copy the MySQL connector JAR into Hive's lib directory:

cp /usr/share/java/mysql-connector-java-5.1.17.jar /usr/local/hive/lib

5) In MySQL, create the Hive metastore database and grant the hive user access to it:

create database if not exists hive_metadata;
grant all privileges on hive_metadata.* to 'hive'@'%' identified by 'hive';
grant all privileges on hive_metadata.* to 'hive'@'localhost' identified by 'hive';
grant all privileges on hive_metadata.* to 'hive'@'spark1' identified by 'hive';
flush privileges;
use hive_metadata;

3. Configure hive-site.xml

In /usr/local/hive/conf:

mv hive-default.xml.template hive-site.xml
vi hive-site.xml

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://spark1:3306/hive_metadata?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
</property>

4. Configure hive-env.sh and hive-config.sh
mv hive-env.sh.template hive-env.sh
vi /usr/local/hive/bin/hive-config.sh
export JAVA_HOME=/usr/java/latest
export HIVE_HOME=/usr/local/hive
export HADOOP_HOME=/usr/local/hadoop

5. Verify that Hive was installed successfully

Simply run the hive command; if everything is set up correctly, it drops you into the Hive command line (the hive> prompt).
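Before launching Hive, the metastore settings from section 3 can be sanity-checked with a short script. The sketch below is illustrative only: it writes a sample hive-site.xml fragment to a temporary path (hypothetical; on a real install you would point SITE at /usr/local/hive/conf/hive-site.xml and skip the generation step) and then verifies that the four JDBC connection properties are present.

```shell
#!/bin/sh
# Sketch: check that hive-site.xml carries the four JDBC settings from
# section 3. The sample file below is generated only so the check can be
# demonstrated standalone; on spark1 you would check the real config file.

SITE=/tmp/hive-site-sample.xml   # real install: /usr/local/hive/conf/hive-site.xml

cat > "$SITE" <<'EOF'
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://spark1:3306/hive_metadata?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
</configuration>
EOF

# Fail loudly if any required property is missing.
for p in ConnectionURL ConnectionDriverName ConnectionUserName ConnectionPassword; do
    if grep -q "javax.jdo.option.$p" "$SITE"; then
        echo "OK: $p present"
    else
        echo "MISSING: $p" >&2
        exit 1
    fi
done
```

A check like this catches the most common first-run failure, a typo in one of the javax.jdo.option property names, before Hive reports a much less readable metastore connection error.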