
Spark learning notes - Spark 2.2.0



master: 192.168.11.2
s1: 192.168.11.3
s2: 192.168.11.4
Three nodes in total.
Step 1: configuration (identical on all three machines): http://hadoop.apache.org/docs/r2.7.4/hadoop-project-dist/hadoop-common/ClusterSetup.html
1> etc/hadoop/core-site.xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/data/hadoop-2.7.4/hadoop-${user.name}</value>
  </property>
</configuration>
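To confirm the value Hadoop actually picks up from core-site.xml, the hdfs getconf command (available in Hadoop 2.7.x) can read a key back:

bin/hdfs getconf -confKey fs.defaultFS
# should print: hdfs://master:9000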
2> etc/hadoop/slaves
master
s1
s2
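Since all three machines need the same configuration, one way to push it from the master is scp; a minimal sketch, assuming the install path from the env.sh step below and passwordless SSH to s1 and s2:

# copy the whole Hadoop config directory from master to both slaves
for h in s1 s2; do
  scp -r /home/spark/bd/hadoop-2.7.4/etc/hadoop "$h":/home/spark/bd/hadoop-2.7.4/etc/
done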
3> vi /etc/profile.d/env.sh
Add:
export HADOOP_CONF_DIR=/home/spark/bd/hadoop-2.7.4/etc/hadoop
export JAVA_HOME=/opt/jdk
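With the environment set, HDFS must be formatted once and the daemons started from the master before anything shows up in jps; a minimal sketch, assuming the install lives at /home/spark/bd/hadoop-2.7.4 (start-dfs.sh and start-yarn.sh reach s1 and s2 over SSH using etc/hadoop/slaves):

source /etc/profile.d/env.sh
cd /home/spark/bd/hadoop-2.7.4
bin/hdfs namenode -format   # first run only; wipes any existing HDFS metadata
sbin/start-dfs.sh           # NameNode, DataNodes, SecondaryNameNode
sbin/start-yarn.sh          # ResourceManager, NodeManagers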
Run jps to check whether the daemons have started.
To troubleshoot problems, check the logs.
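Roughly what jps should report when the cluster is healthy (process IDs omitted; master doubles as a worker because it is listed in slaves), and where the logs live by default:

jps
# on master: NameNode, SecondaryNameNode, ResourceManager, DataNode, NodeManager
# on s1/s2:  DataNode, NodeManager

# a daemon that failed to start usually left a log behind, e.g.:
less /home/spark/bd/hadoop-2.7.4/logs/hadoop-*-namenode-*.log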
Submitting Spark jobs to YARN follows http://spark.apache.org/docs/latest/running-on-yarn.html:

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 4g \
  --executor-cores 2 \
  examples/jars/spark-examples*.jar 10
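In cluster deploy mode the driver runs inside YARN, so the example's "Pi is roughly ..." line goes to the container logs rather than the local terminal. One way to fetch it once the job finishes, assuming YARN log aggregation is enabled (the application ID below is a placeholder; use the one spark-submit prints):

# application_1500000000000_0001 is a placeholder ID
yarn logs -applicationId application_1500000000000_0001 | grep "Pi is roughly"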
