SirOibaf / spark-chef

Apache Spark chef cookbook


Install Spark standalone

Install Spark YARN

References

Set `spark.yarn.jars` by uploading the Spark jars to HDFS and pointing the configuration at them:

```shell
$ cd $SPARK_HOME
$ hadoop fs -mkdir spark-2.0.0-bin-hadoop
$ hadoop fs -copyFromLocal jars/* spark-2.0.0-bin-hadoop
$ echo "spark.yarn.jars=hdfs:///nameservice1/user//spark-2.0.0-bin-hadoop/*" >> conf/spark-defaults.conf
```

If you do have access to the local directories of all the nodes in your cluster, you can instead copy the archive or the Spark jars to the local directory of each data node using rsync or scp. Then update the URLs from hdfs:// to local:.
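A minimal sketch of this local-copy alternative, assuming hypothetical hostnames (`datanode1`, `datanode2`, `datanode3`) and a hypothetical destination directory `/opt/spark-jars`; the function only prints the commands (a dry run) so they can be reviewed before executing:

```shell
#!/bin/sh
# Hypothetical node list and destination -- adjust for your cluster.
NODES="datanode1 datanode2 datanode3"
DEST=/opt/spark-jars

# Print (dry run) the rsync invocations that would copy the jars
# from $SPARK_HOME/jars to each node's local disk, plus the matching
# spark-defaults.conf entry using the local: scheme.
print_copy_commands() {
  for node in $NODES; do
    echo "rsync -a \$SPARK_HOME/jars/ $node:$DEST/"
  done
  echo "spark.yarn.jars=local:$DEST/*"
}

print_copy_commands
```

With the `local:` scheme, each executor reads the jars from its own disk instead of fetching them from HDFS.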


License: GNU Affero General Public License v3.0

