big-data-europe / docker-spark

Apache Spark docker image

how to join a spark-master from a different node?

AnakinSkyCN opened this issue · comments

Hi,

I've read the samples, and it seems that the docker-compose.yaml creates both the master and the workers on the same node, so the deployment is constrained to a single machine.

How am I supposed to write a docker-compose.yaml that joins an existing master running on a different node?
I've tried deleting the "spark-master" section to avoid starting another master, but the workers won't start without the master they "depends_on". If I keep the master section, the worker node starts a new master, which is not what I want.

So, how can I join the existing master without defining a master again?

Thank you.

Hi,

I also have this question. How can we join workers from other nodes?

Thanks.

Do you have your master node's container running? If so, I believe you can build an image from spark-base or the worker and run it on the same network as your master node.
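
A minimal sketch of what that worker-only compose file could look like, assuming the bde2020 worker image reads the master URL from the SPARK_MASTER environment variable; the hostname spark-master-host and the image tag below are placeholders you would replace with your own master's address and version:

```yaml
version: "3"
services:
  spark-worker:
    # Tag is an example; use the image/tag matching your running master.
    image: bde2020/spark-worker:3.3.0-hadoop3.3
    environment:
      # Point at the master on the other node instead of a local
      # "spark-master" service. "spark-master-host" is a placeholder
      # for your master's hostname or IP.
      - SPARK_MASTER=spark://spark-master-host:7077
    ports:
      - "8081:8081"   # worker web UI
```

For the worker to resolve the master across nodes, the containers need to share a network, e.g. an attachable overlay network (`docker network create -d overlay --attachable spark-net`) referenced under `networks:` on both nodes, or alternatively the master's host IP reachable directly.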