
Compiling Hadoop 2.8.3

Overview

Hadoop can be obtained by building the source with Maven (this walkthrough uses the source-build approach).

1. Pre-installation preparation

1.1 Download the Hadoop source tarball and build dependencies

Download links

[[email protected] ~]# cd /tmp/
[[email protected] tmp]# wget https://github.com/apache/hadoop/archive/rel/release-2.8.3.tar.gz
[[email protected] tmp]# wget ftp://ftp.netbsd.org/pub/pkgsrc/distfiles/protobuf-2.5.0.tar.gz
[[email protected] tmp]# wget --no-check-certificate https://sourceforge.net/projects/findbugs/files/findbugs/1.3.9/findbugs-1.3.9.tar.gz/download -O findbugs-1.3.9.tar.gz

1.2 Install Oracle JDK 1.8 (avoid OpenJDK if possible)

Download link

[[email protected] tmp]# yum install -y lrzsz
[[email protected] tmp]# ls
jdk-8u151-linux-x64.tar.gz  release-2.8.3.tar.gz

####Create the directory; the CDH Java environment reads /usr/java by default
[[email protected] tmp]# mkdir /usr/java
[[email protected] tmp]# tar xf jdk-8u151-linux-x64.tar.gz -C /usr/java/
[[email protected] tmp]# ln -s /usr/java/jdk1.8.0_151/ /usr/java/jdk

####Set global environment variables
[[email protected] tmp]# vim /etc/profile
JAVA_HOME=/usr/java/jdk
PATH=$JAVA_HOME/bin:$PATH
CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export JAVA_HOME
export PATH
export CLASSPATH
[[email protected] tmp]# source /etc/profile

####Verify the environment variables
[[email protected] ~]# which java
/usr/java/jdk/bin/java

1.3 Install Maven (version 3.3.9)

Download link

[[email protected] tmp]# wget https://archive.apache.org/dist/maven/maven-3/3.5.2/binaries/apache-maven-3.5.2-bin.tar.gz
[[email protected] tmp]# tar xf apache-maven-3.5.2-bin.tar.gz -C /usr/local
[[email protected] tmp]# cd /usr/local/
[[email protected] local]# ln -s apache-maven-3.5.2/ apache-maven

####Set global environment variables
[[email protected] local]# vim /etc/profile

####CDH reads the JVM from /usr/java by default
JAVA_HOME=/usr/java/jdk
MAVEN_HOME=/usr/local/apache-maven
PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$PATH
CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

export JAVA_HOME
export MAVEN_HOME
export PATH
export CLASSPATH
[[email protected] tmp]# source /etc/profile

####Verify the environment variables
[[email protected] ~]# which mvn
/usr/local/apache-maven/bin/mvn


####Note: compiling with Maven 3.5.2, the newest release at the time, reported errors

2. Hadoop installation and deployment

2.1 Create the hadoop user and group

[[email protected] ~]# useradd -u 515 -m  hadoop -s /bin/bash

2.2 Unpack the Hadoop source and compile

Reference link

[[email protected] ~]# yum install screen -y
[[email protected] ~]# cd /tmp/
[[email protected] tmp]# yum install -y cmake gcc gcc-c++

####Install protobuf
[[email protected] tmp]# tar xf protobuf-2.5.0.tar.gz
[[email protected] tmp]# cd protobuf-2.5.0
[[email protected] protobuf-2.5.0]# ./configure && make && make install
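With protobuf installed, it is worth confirming that the protoc on PATH is the 2.5.0 build Hadoop 2.8.3 expects — a missing or mismatched protoc is exactly the "'protoc --version' did not return a version" failure listed among the problems later. A minimal check, assuming the default /usr/local install prefix:

```shell
# Report the protoc version Hadoop's build will see, or explain why not.
# Hadoop 2.8.3 expects: libprotoc 2.5.0
check_protoc() {
    if command -v protoc >/dev/null 2>&1; then
        protoc --version
    else
        echo "protoc not found on PATH; re-check 'make install' and run ldconfig"
    fi
}
check_protoc
```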

####Install Findbugs
[[email protected] tmp]# tar xf findbugs-1.3.9.tar.gz
[[email protected] tmp]# mv findbugs-1.3.9 /usr/local/
####symlink so that FINDBUGS_HOME=/usr/local/findbugs resolves
[[email protected] tmp]# ln -s /usr/local/findbugs-1.3.9 /usr/local/findbugs
[[email protected] tmp]# vim /etc/profile
JAVA_HOME=/usr/java/jdk
MAVEN_HOME=/usr/local/apache-maven
FINDBUGS_HOME=/usr/local/findbugs
HADOOP_HOME=/usr/local/hadoop
PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$FINDBUGS_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

export JAVA_HOME
export MAVEN_HOME
export FINDBUGS_HOME
export HADOOP_HOME
export PATH
export CLASSPATH
[[email protected] ~]# source /etc/profile

####Install Forrest
[[email protected] tmp]# tar xf apache-forrest-0.9-sources.tar.gz
[[email protected] tmp]# tar xf apache-forrest-0.9-dependencies.tar.gz
[[email protected] tmp]# cp -aPr apache-forrest-0.9 /usr/local/
[[email protected] tmp]# ln -s /usr/local/apache-forrest-0.9 /usr/local/apache-forrest
[[email protected] tmp]# vim /etc/profile
JAVA_HOME=/usr/java/jdk
MAVEN_HOME=/usr/local/apache-maven
FINDBUGS_HOME=/usr/local/findbugs
FORREST_HOME=/usr/local/apache-forrest
HADOOP_HOME=/usr/local/hadoop
PATH=$JAVA_HOME/bin:$MAVEN_HOME/bin:$FINDBUGS_HOME/bin:$FORREST_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH
CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

export JAVA_HOME
export MAVEN_HOME
export FINDBUGS_HOME 
export FORREST_HOME
export HADOOP_HOME
export PATH
export CLASSPATH
[[email protected] ~]# source /etc/profile

####Other dependencies
yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake
yum install -y ant patch
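Because a missing tool may only surface many minutes into the build, a quick pre-flight sketch like the following (tool names collected from the dependency sections above) can save a wasted run:

```shell
# Pre-flight check: confirm every build tool used in this walkthrough
# resolves on PATH before launching the long mvn build.
missing=""
for tool in java mvn gcc g++ cmake make ant protoc; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
if [ -n "$missing" ]; then
    echo "missing build tools:$missing"
else
    echo "all build tools found"
fi
```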

[[email protected] tmp]# tar xf release-2.8.3.tar.gz 
[[email protected] tmp]# cd hadoop-rel-release-2.8.3/

####Hadoop source build reference documentation
[[email protected] hadoop-rel-release-2.8.3]# cat BUILDING.txt
Build instructions for Hadoop

----------------------------------------------------------------------------------
Requirements:

* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

Building distributions:

Create binary distribution without native code and without documentation:
  $ mvn package -Pdist -DskipTests -Dtar -Dmaven.javadoc.skip=true

Create binary distribution with native code and with documentation:
  $ mvn package -Pdist,native,docs -DskipTests -Dtar

Create source distribution:
  $ mvn package -Psrc -DskipTests

Create source and binary distributions with native code and documentation:
  $ mvn package -Pdist,native,docs,src -DskipTests -Dtar

Create a local staging version of the website (in /tmp/hadoop-site)
  $ mvn clean site -Preleasedocs; mvn site:stage -DstagingDirectory=/tmp/hadoop-site



[[email protected] hadoop-rel-release-2.8.3]# screen -S hadoop-complie
[[email protected] hadoop-rel-release-2.8.3]# source /etc/profile
[[email protected] hadoop-rel-release-2.8.3]# mvn clean package -Pdist,native -DskipTests -Dtar

Press Ctrl+a then d to detach from the screen session temporarily (reattach later with screen -r hadoop-complie)
[[email protected] hadoop-rel-release-2.8.3]# screen -list
There is a screen on:
    29028.hadoop-complie    (Detached)


     [exec] Hadoop dist tar available at: /tmp/hadoop-rel-release-2.8.3/hadoop-dist/target/hadoop-2.8.3.tar.gz
     [exec] 
[INFO] Executed tasks
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [ 16.396 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  0.635 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.771 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  2.545 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.190 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  1.394 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  3.957 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  4.994 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  5.381 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  3.455 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:22 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  5.536 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 25.214 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.046 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [01:15 min]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [01:12 min]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [  9.247 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 26.437 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 13.187 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  3.630 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.035 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [  0.034 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 17.175 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [01:23 min]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [  0.040 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [  6.368 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 15.875 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [  3.191 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [ 21.611 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 23.891 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [  1.207 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [  5.441 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [  3.466 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [  3.128 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [  0.030 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [  2.742 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [  1.930 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [  0.050 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [  4.734 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [  5.174 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  0.141 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 24.020 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 17.155 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [  3.572 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [  9.888 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [  5.123 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [ 10.669 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [  2.302 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  5.157 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [  3.424 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  7.017 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [  5.061 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  2.145 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [  2.210 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  5.128 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  4.210 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  2.330 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.208 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  3.015 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  8.250 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  4.430 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 17.715 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  7.833 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  5.975 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.898 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  4.790 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [ 12.119 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  6.030 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.043 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 34.428 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:53 min
[INFO] Finished at: 2017-12-22T17:39:10+08:00
[INFO] Final Memory: 241M/592M
[INFO] ------------------------------------------------------------------------

real    12m55.948s
user    19m56.246s
sys     1m13.708s


####The successfully compiled tarball
[[email protected] hadoop-rel-release-2.8.3]# cd hadoop-dist/target/
[[email protected] target]# ls hadoop-2.8.3.tar.gz 
hadoop-2.8.3.tar.gz

Tips:
1. Sometimes downloading a dependency takes far too long because the connection to the remote site stalls; press Ctrl+C and re-run the build command.
2. If the build complains that some file is missing, clean the Maven build first (mvn clean) and then recompile.
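These two recovery paths can be wrapped in small helpers (a sketch, assumed to run from the source root; the module name passed to resume_from comes from Maven's own "-rf :<module>" hint printed at the end of a failed log):

```shell
# Recovery helpers for a failed build (run from the hadoop source root).
# full_retry: wipe all build output and rebuild everything from scratch.
full_retry() {
    mvn clean
    mvn package -Pdist,native -DskipTests -Dtar
}
# resume_from: rerun starting at the module Maven flagged, keeping the
# modules that already built (do not clean first, or their output is lost).
resume_from() {
    mvn package -Pdist,native -DskipTests -Dtar -rf ":$1"
}
# usage: resume_from hadoop-common
```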

Reference link

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (site) on project hadoop-common: An Ant BuildException has occured: stylesheet /tmp/hadoop-rel-release-2.8.3/hadoop-common-project/hadoop-common/${env.FINDBUGS_HOME}/src/xsl/default.xsl doesn't exist.
[ERROR] around Ant part ...<xslt in="/tmp/hadoop-rel-release-2.8.3/hadoop-common-project/hadoop-common/target/findbugsXml.xml" style="${env.FINDBUGS_HOME}/src/xsl/default.xsl" out="/tmp/hadoop-rel-release-2.8.3/hadoop-common-project/hadoop-common/target/site/findbugs.html"/>... @ 33:251 in /tmp/hadoop-rel-release-2.8.3/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common



Problems encountered during compilation:
[ERROR] Failed to execute goal on project hadoop-common: Could not resolve dependencies for project org.apache.hadoop:hadoop-common:jar:2.7.1: Could not transfer artifact org.apache.commons:commons-math3:jar:3.1.1 from/to nexus-osc (http://maven.oschina.net/content/groups/public/): GET request of: org/apache/commons/commons-math3/3.1.1/commons-math3-3.1.1.jar from nexus-osc failed: Premature end of Content-Length delimited message body (expected: 1599627; received: 866169 -> [Help 1]

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.7.1:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]

If the build keeps complaining about one missing thing after another, do what the Thrift build instructions suggest and install the whole development toolchain:

yum -y groupinstall "Development Tools"

ant is required: yum install ant

Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program “cmake” (in directory “/root/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/target/native”): error=2, No such file or directory

Findbugs is required:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (site) on project hadoop-common: An Ant BuildException has occured: stylesheet /home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/${env.FINDBUGS_HOME}/src/xsl/default.xsl doesn't exist.

[ERROR] around Ant part ...<xslt in="/home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/target/findbugsXml.xml" style="${env.FINDBUGS_HOME}/src/xsl/default.xsl" out="/home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/target/site/findbugs.html"/>... @ 43:251 in /home/hadoop/hadoop-2.7.1-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml

Then set the environment variable: export FINDBUGS_HOME=/usr/local/findbugs-3.0.0

cmake is required:

Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-pipes: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="/home/pory/workplace/hadoop-2.4.1-src/hadoop-tools/hadoop-pipes/target/native" executable="cmake" failonerror="true">... @ 5:131 in /home/pory/workplace/hadoop-2.4.1-src/hadoop-tools/hadoop-pipes/target/antrun/build-main.xml
