big-data-europe / docker-spark

Apache Spark Docker image

Worker does not start

0neday opened this issue

Here is what happens when I try to start the worker:

bash-5.0# ./start-worker.sh    spark://spark:7077 
rsync from spark://spark:7077
/spark/sbin/spark-daemon.sh: line 177: rsync: command not found
starting org.apache.spark.deploy.worker.Worker, logging to /spark/logs/spark--org.apache.spark.deploy.worker.Worker-1-6f7782b9b0d5.out
ps: unrecognized option: p
BusyBox v1.30.1 (2020-05-30 09:44:53 UTC) multi-call binary.

Usage: ps [-o COL1,COL2=HEADER]

Show list of processes

        -o COL1,COL2=HEADER     Select columns for display
ps: unrecognized option: p
BusyBox v1.30.1 (2020-05-30 09:44:53 UTC) multi-call binary.

Usage: ps [-o COL1,COL2=HEADER]

Show list of processes

        -o COL1,COL2=HEADER     Select columns for display

Hi @0neday ,

Could you tell us a bit more about how you are trying to start the worker, and from which image? If you set it up via our example docker-compose file, the worker is started automatically; the same applies when starting it with a plain docker run using our worker image (a sketch follows below).

Let me know a bit more so that we can resolve this.

Best regards,