1. Download Spark
http://mirrors.cnnic.cn/apache/spark/spark-1.3.0/spark-1.3.0-bin-hadoop2.3.tgz
2. Download Scala
http://www.scala-lang.org/download/2.10.5.html
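Both archives can be fetched directly with wget; the Spark URL is the one above, while the Scala tarball URL below is inferred from the download page and may need adjusting:
wget http://mirrors.cnnic.cn/apache/spark/spark-1.3.0/spark-1.3.0-bin-hadoop2.3.tgz
wget http://downloads.typesafe.com/scala/2.10.5/scala-2.10.5.tgz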
3. Install Scala
mkdir /usr/lib/scala
tar -zxvf scala-2.10.5.tgz
mv scala-2.10.5 /usr/lib/scala
4. Set the Scala path
vim /etc/bashrc
export SCALA_HOME=/usr/lib/scala/scala-2.10.5
export PATH=$SCALA_HOME/bin:$PATH
source /etc/bashrc
scala -version
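If the PATH was picked up correctly, the version check should print something like:
Scala code runner version 2.10.5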
5. Distribute to the other nodes
scp -r /usr/lib/scala/ hd2:/usr/lib/scala
scp -r /usr/lib/scala/ hd3:/usr/lib/scala
scp -r /usr/lib/scala/ hd4:/usr/lib/scala
scp -r /usr/lib/scala/ hd5:/usr/lib/scala
scp /etc/bashrc hd2:/etc/bashrc
scp /etc/bashrc hd3:/etc/bashrc
scp /etc/bashrc hd4:/etc/bashrc
scp /etc/bashrc hd5:/etc/bashrc
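The scp commands above can also be written as a loop over the slave hosts (same hostnames as in the steps above, passwordless SSH as root assumed):
for h in hd2 hd3 hd4 hd5; do
  scp -r /usr/lib/scala/ ${h}:/usr/lib/scala
  scp /etc/bashrc ${h}:/etc/bashrc
done
After copying, run source /etc/bashrc on each node so the new PATH takes effect there as well.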
6. Install Spark
tar -zxvf spark-1.3.0-bin-hadoop2.3.tgz
mkdir /usr/local/spark
mv spark-1.3.0-bin-hadoop2.3 /usr/local/spark
vim /etc/bashrc
export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3
export PATH=$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH
source /etc/bashrc
cd /usr/local/spark/spark-1.3.0-bin-hadoop2.3/conf/
cp spark-env.sh.template spark-env.sh
vim spark-env.sh
export JAVA_HOME=/java
export SCALA_HOME=/usr/lib/scala/scala-2.10.5
export SPARK_HOME=/usr/local/spark/spark-1.3.0-bin-hadoop2.3
export SPARK_MASTER_IP=192.168.137.101
export SPARK_WORKER_MEMORY=10g
export SPARK_DRIVER_MEMORY=9g
export HADOOP_CONF_DIR=/home/hadoop/hadoop/etc/hadoop
export SPARK_LIBRARY_PATH=$SPARK_HOME/lib
export SCALA_LIBRARY_PATH=$SPARK_LIBRARY_PATH
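A few notes on the values above (my reading, not from the original post); they could also be kept as comments in spark-env.sh:
# SPARK_MASTER_IP     -- address of the master node (hd1, 192.168.137.101)
# SPARK_WORKER_MEMORY -- total memory each Worker may hand out to executors on its node
# SPARK_DRIVER_MEMORY -- default memory for the driver process
# HADOOP_CONF_DIR     -- points Spark at the Hadoop configuration so it can reach HDFS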
cp slaves.template slaves
vim slaves
hd1
hd2
hd3
hd4
hd5
7. Distribute to the other nodes
scp /etc/bashrc hd2:/etc
scp /etc/bashrc hd3:/etc
scp /etc/bashrc hd4:/etc
scp /etc/bashrc hd5:/etc
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd2:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd3:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd4:/usr/local/spark/
scp -r /usr/local/spark/spark-1.3.0-bin-hadoop2.3 hd5:/usr/local/spark/
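start-all.sh launches the Master locally and the Workers over SSH, so hd1 needs passwordless SSH to every host listed in slaves. A quick check (hostnames taken from the slaves file above):
for h in hd1 hd2 hd3 hd4 hd5; do ssh $h hostname; done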
8. Start the cluster
On hd1, start all the daemons:
cd $SPARK_HOME/sbin
./start-all.sh
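To verify that the cluster came up, a few quick checks (default ports assumed):
jps            # hd1 should show a Master (and a Worker, since hd1 is also listed in slaves)
ssh hd2 jps    # each slave should show a Worker
The master web UI should be reachable at http://192.168.137.101:8080, and the bundled example can serve as a smoke test (without MASTER set it runs locally):
MASTER=spark://192.168.137.101:7077 $SPARK_HOME/bin/run-example SparkPi 10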