16/03/04 00:21:09 WARN SparkContext: Using SPARK_MEM to set amount of memory to use per executor process is deprecated, please use spark.executor.memory instead.
16/03/04 00:21:09 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'at'
at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2554)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:489)
at com.bigdata.deal.scala.DomainLib$.main(DomainLib.scala:22)
at com.bigdata.deal.scala.DomainLib.main(DomainLib.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
When building the SparkConf, be sure to set both the Spark home and the master address. The exception above shows that the literal token 'at' was parsed as the master URL, which typically means a malformed or mis-split argument landed where a valid URL such as spark://host:7077, local[*], or yarn-cluster was expected; a minimal driver sketch follows.
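For reference, a minimal sketch of what the driver setup might look like; the app name, master host, and install path are assumptions reconstructed from the hostnames in this log, not the original DomainLib.scala:

import org.apache.spark.{SparkConf, SparkContext}

object DomainLib {
  def main(args: Array[String]): Unit = {
    // The master URL must use a recognized scheme (spark://host:port,
    // local[*], yarn-client, yarn-cluster, ...); any other token fails
    // with the "Could not parse Master URL" exception above.
    val conf = new SparkConf()
      .setAppName("DomainLib")
      .setMaster("spark://mini-cp1:7077")                   // assumed: matches SPARK_MASTER_IP below
      .setSparkHome("/usr/local/spark-1.4.0-bin-hadoop2.6") // assumed: install path from the shell prompt below
    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}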
Also note that spark-env.sh needs the settings below; with these in place, the problem goes away:
[root@mini-cp1 spark-1.4.0-bin-hadoop2.6]# cat conf/spark-env.sh
#!/usr/bin/env bash
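# Master host name; drivers should then use the master URL
# spark://mini-cp1:7077 (7077 is the default standalone port)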
SPARK_MASTER_IP=mini-cp1
# JAVA_HOME (the JDK root directory) must be exported
export JAVA_HOME=/usr/local/jdk1.7.0_65
export HADOOP_HOME=/usr/local/hadoop-2.6.0
#export SCALA_HOME=/opt/scala
export SPARK_WORKER_MEMORY=3g
export HADOOP_CONF_DIR=/usr/local/hadoop-2.6.0/etc/hadoop
#SPARK_MEM=${SPARK_MEM:-1g}
export SPARK_MEM=3g
export HADOOP_COMMON_LIB_NATIVE_DIR=/usr/local/hadoop-2.6.0/lib/native
export HADOOP_OPTS="-Djava.library.path=/usr/local/hadoop-2.6.0/lib"
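As the first warning in the log says, SPARK_MEM is deprecated in favor of the spark.executor.memory property. A minimal sketch of the replacement, assuming the same 3g budget (it can equally be set in conf/spark-defaults.conf):

import org.apache.spark.SparkConf

// Equivalent to "export SPARK_MEM=3g", using the supported property:
val conf = new SparkConf().set("spark.executor.memory", "3g")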