big-data-europe / docker-spark

Apache Spark docker image


Example ENABLE_INIT_DAEMON submit step

noahkawasakigoogle opened this issue

Hello, this is a feature request from those of us who are not experts in how Spark works: please add an example of an additional step to run with the ENABLE_INIT_DAEMON flag.
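For reference, a minimal sketch of how this flag is toggled today, assuming the bde2020/spark-master image and the ENABLE_INIT_DAEMON variable used in this repo's compose examples (the image tag is one of the published ones):

# Sketch: standalone master with the init daemon disabled, the usual
# setup when no pipeline orchestration is involved.
docker run -d --name spark-master \
  -e ENABLE_INIT_DAEMON=false \
  -p 8080:8080 -p 7077:7077 \
  bde2020/spark-master:2.4.0-hadoop2.7

The request is for a documented example of the opposite case: the flag enabled, with an extra submit step running as part of container init.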

In my case, I am hoping to use it to start the Hive Thrift server when the containers come up, so that from outside the container I can run JDBC tests in CI as well as a simple ETL script that creates some tables and loads a small dataset.
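To make the JDBC half concrete, here is a sketch of the kind of CI smoke test I have in mind, assuming beeline is available on the host (any Spark or Hive distribution ships it) and that the container publishes the Thrift server's default port (-p 10000:10000); the table name is made up:

# Sketch: create a table, load a row, and read it back over JDBC.
/spark/bin/beeline -u jdbc:hive2://localhost:10000 -e '
  CREATE TABLE IF NOT EXISTS smoke_test (id INT, name STRING);
  INSERT INTO smoke_test VALUES (1, "spark");
  SELECT * FROM smoke_test;
'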

I have found that this command starts the Thrift server:

/spark/bin/spark-class org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal
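For what it's worth, that long invocation is essentially what Spark's bundled wrapper script expands to, so the same server can be started more compactly; the main difference is that the wrapper daemonizes instead of staying in the foreground (path assumes the image's /spark layout):

/spark/sbin/start-thriftserver.sh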

So I will probably run it via docker exec before starting the test runner, but it would be great if this could be included in the container's initialization.
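A rough sketch of that docker exec sequence, with a hypothetical container name (spark-master) and test runner script, waiting on the Thrift server's default port 10000 before the tests start:

# Launch the Thrift server inside the already-running container, detached.
docker exec -d spark-master /spark/sbin/start-thriftserver.sh

# Block until the JDBC port accepts connections, then run the tests.
until docker exec spark-master bash -c 'echo > /dev/tcp/localhost/10000' 2>/dev/null; do
  sleep 1
done
./run-integration-tests.sh  # hypothetical CI test runner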

Thanks