
Compiling hadoop-2.4.0 from source

The system is Ubuntu 14.04, 32-bit. I had always used the official release package before (the official builds are 32-bit); this time I tried compiling it myself. The process was roughly as follows:

1. Download the source package hadoop-2.4.0-src.tar.gz

After the download completes, extract it to get the Hadoop source directory: hadoop-2.4.0-src

2. Install the software required for the build:

1) Install JDK 1.7 (be sure not to use JDK 1.8 for this build):

After downloading, extract and install it under /usr/local/, then configure the environment variables:

export JAVA_HOME=/usr/local/jdk1.7.0_79

export JRE_HOME=/usr/local/jdk1.7.0_79/jre

export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH

Then save and exit, and source the file to apply the changes.

Check that the installation succeeded:

java -version
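
If the JDK is set up correctly, this should report the 1.7 release you installed; the first line of the output will look like:

java version "1.7.0_79"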

2) Install Maven

After downloading, extract to get apache-maven-3.2.5, install it under /usr, and configure the environment variables:

export MAVEN_HOME=/usr/apache-maven-3.2.5
export MAVEN_OPTS="-Xms128m -Xmx512m"

export PATH=$MAVEN_HOME/bin:$PATH

Then save and exit, and source the file.

Verify the installation:

mvn -v
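
If Maven is wired up correctly, the first lines of the output should look roughly like this (the Maven home and Java paths reflect this particular setup):

Apache Maven 3.2.5 (...)
Maven home: /usr/apache-maven-3.2.5
Java version: 1.7.0_79, vendor: Oracle Corporation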

3) Install Ant

After downloading, extract to get apache-ant-1.9.4, install it under /usr, and configure the environment variables:

export ANT_HOME=/usr/apache-ant-1.9.4
export PATH=$ANT_HOME/bin:$PATH

Then save and exit, and source the file.

Verify the installation:

ant -version
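
Expected output is along these lines (the compile date varies by release, so it is left out here):

Apache Ant(TM) version 1.9.4 compiled on ...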

4) Install g++

  sudo apt-get install g++
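
This provides the C++ compiler needed for the native parts of the build. As a sketch of an alternative (not in the original steps), installing build-essential pulls in gcc, g++, and make in one go, and g++ --version confirms the compiler is available:

sudo apt-get install build-essential
g++ --version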

5) Install protobuf (Hadoop 2.4.0 requires protobuf 2.5.0):

tar xzf protobuf-2.5.0.tar.gz

cd protobuf-2.5.0

sudo ./configure --prefix=/usr/protobuf    (--prefix specifies the install directory; alternatively, run sudo ./configure with no prefix to install to the default location)

 sudo make
 sudo make install

 sudo ldconfig

Configure the environment variables:

export PROTOC_HOME=/usr/protobuf
export PATH=$PROTOC_HOME/bin:$PATH

export LD_LIBRARY_PATH=/usr/protobuf/lib:$LD_LIBRARY_PATH

Then save and exit, and source the file.

Verify the installation:

$ protoc --version
libprotoc 2.5.0

6) Install cmake

 tar xzf cmake-2.8.12.2.tar.gz
 cd cmake-2.8.12.2
 sudo ./bootstrap --prefix=/usr/cmake    (--prefix specifies the install directory)
 sudo make
 sudo make install

Configure the environment variables:

export CMAKE_HOME=/usr/cmake
export PATH=$CMAKE_HOME/bin:$PATH
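
As with the other tools, reload your shell profile and check that the newly installed cmake is the one being picked up (which file you source depends on where you put the exports; ~/.bashrc is assumed here):

source ~/.bashrc
cmake --version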

7) Install the OpenSSL development headers

sudo apt-get install libssl-dev

8) Install libglib2.0-dev

sudo apt-get install libglib2.0-dev

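
For reference, the apt-managed dependencies from steps 4, 7, and 8 can be installed in a single command; zlib1g-dev is an addition of mine, since the native Hadoop build generally also wants the zlib headers:

sudo apt-get install g++ libssl-dev libglib2.0-dev zlib1g-dev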

3. Build hadoop-2.4.0

cd ~/hadoop-2.4.0-src

mvn package -Pdist -DskipTests -Dtar  

Note: make sure the hadoop-2.4.0-src directory is owned by your build user:

sudo chown -R hadoop:hadoop hadoop-2.4.0-src
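
Note that -Pdist -DskipTests -Dtar builds the distribution without the native libraries. If you also want libhadoop built (this is what g++, cmake, and the ssl headers are for), BUILDING.txt in the source tree uses the native profile:

mvn package -Pdist,native -DskipTests -Dtar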

After the build completes:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [  4.940 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  2.429 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  3.136 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.521 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  2.996 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  4.248 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  2.875 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  2.447 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  2.249 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:41 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  5.997 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.085 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [04:11 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 19.371 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 31.596 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  4.239 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.054 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.073 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [01:43 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 52.105 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.114 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [  5.712 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 56.731 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  1.851 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 14.598 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 10.669 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  2.724 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  3.387 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.290 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  2.537 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  1.601 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.116 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [  9.501 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.075 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 20.997 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 15.518 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  1.601 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [  5.851 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  4.736 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 16.372 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  1.167 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  3.231 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  5.312 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  8.653 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 23.317 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  1.464 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  3.259 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  2.664 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  1.524 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  1.646 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  0.035 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  3.909 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  8.041 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.147 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 13.277 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  6.222 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.024 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 39.922 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:00 min
[INFO] Finished at: 2015-04-19T10:52:55+08:00
[INFO] Final Memory: 126M/391M
[INFO] ------------------------------------------------------------------------

The directory ~/hadoop-2.4.0-src/hadoop-dist/target now contains the file hadoop-2.4.0.tar.gz, which is exactly what we need.
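
As a quick sanity check (a sketch; the /tmp destination is arbitrary), unpack the tarball and ask the resulting binary for its version:

tar xzf hadoop-dist/target/hadoop-2.4.0.tar.gz -C /tmp
/tmp/hadoop-2.4.0/bin/hadoop version

The first line of the output should read Hadoop 2.4.0.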

Of course, this build ran into quite a few problems; for lack of time I only recorded a few of them. For details, see my other blog post: hadoop-2.4.0 source build problems.
