big-data-europe / docker-spark

Apache Spark docker image


Spark task is not working

joyatcloudfall opened this issue

I tried to use PySpark to run a simple count job.

The job was submitted to Spark, but it never started running.
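For anyone reproducing this, a minimal count job against a standalone master might look like the sketch below. The master URL `spark://spark-master:7077` and the resource settings are assumptions (they are not visible in the original screenshots), so substitute the values from your own compose setup.

```python
# Minimal sketch of a PySpark count job against a standalone master.
# The master URL and resource values are assumptions; adjust to your cluster.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("count-test")
    .master("spark://spark-master:7077")      # assumed master address
    .config("spark.executor.memory", "512m")  # stay below worker capacity
    .config("spark.cores.max", "1")
    .getOrCreate()
)

# Trivial job: parallelize a small range and count it.
rdd = spark.sparkContext.parallelize(range(1000))
print(rdd.count())  # expected: 1000

spark.stop()
```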

Any idea what's going wrong?

commented

Hi,
I've submitted an application to the master from a remote server and I'm hitting the same problem. The driver loops on this message:
WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

Do you have any suggestions?

Images: Spark 3.2.0 for Hadoop 3.2 with OpenJDK 8 and Scala 2.12
Thanks.

Hi, I am having the same issue. Did it ever get resolved?
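For context, that `TaskSchedulerImpl` warning usually has one of two causes: the application requests more memory or cores than any registered worker offers, or the driver runs outside the Docker network and the workers cannot connect back to it (common when submitting from a remote server). Below is a hedged sketch of a remote submission that addresses both; every host name, port, and resource value is an assumption and must be replaced with addresses your workers can actually reach.

```python
# Sketch: submitting from a remote host to a standalone master in Docker.
# All host names, ports, and resource values are assumptions.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("remote-submit-test")
    .master("spark://spark-master:7077")         # assumed master address
    # Workers must be able to connect back to the driver; advertise an
    # address that is routable from inside the Docker network.
    .config("spark.driver.host", "192.168.1.10")  # hypothetical driver IP
    .config("spark.driver.port", "7078")          # fixed, firewall-friendly port
    .config("spark.blockManager.port", "7079")
    # Request less than the workers offer so the scheduler can place tasks.
    .config("spark.executor.memory", "512m")
    .config("spark.cores.max", "1")
    .getOrCreate()
)

print(spark.sparkContext.parallelize(range(100)).count())
spark.stop()
```

If the count still hangs with the same warning, check the master's web UI (port 8080 by default) to confirm the workers are registered and that their advertised memory and cores cover what the application requests.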