# Apache Spark Chef cookbook

## Install Spark standalone

## Install Spark on YARN

## References

set "spark.yarn.jars" $ Cd $SPARK_HOME $ hadoop fs mkdir spark-2.0.0-bin-hadoop $hadoop fs -copyFromLocal jars/* spark-2.0.0-bin-hadoop $ echo "spark.yarn.jars=hdfs:///nameservice1/user//spark-2.0.0-bin-hadoop/*" >> conf/spark-defaults.conf

If you do have access to the local directories of all the nodes in your cluster, you can instead copy the archive or the Spark jars to a local directory on each of the data nodes using rsync or scp. Then update the URL scheme in `spark.yarn.jars` from `hdfs:` to `local:`, as sketched below.
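A minimal sketch of that approach, assuming passwordless SSH to each node; the node list `nodes.txt` and the target path `/opt/spark-jars` are placeholders, not names from this cookbook:

```sh
# Copy the Spark jars to the same local path on every data node
$ for node in $(cat nodes.txt); do
    rsync -av "$SPARK_HOME/jars/" "$node:/opt/spark-jars/"
  done

# Point spark.yarn.jars at the local copies instead of HDFS
$ echo "spark.yarn.jars=local:/opt/spark-jars/*" >> conf/spark-defaults.conf
```

Since `local:` URLs refer to files that already exist on every node, nothing is uploaded at submit time.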