spark task is not working
joyatcloudfall opened this issue
joyatcloudfall commented
Riven commented
Hi,
I've submitted an application to the master from a remote server and hit the same problem. The log loops with this message:
WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
Do you have any suggestions?
Images: Spark 3.2.0 for Hadoop 3.2 with OpenJDK 8 and Scala 2.12
Thanks.
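For what it's worth, this warning typically means either that no workers are registered with the master, or that the application is requesting more cores or memory than any single worker can offer, so the job never gets scheduled. A minimal sketch of a submission that caps the resource request, assuming a standalone cluster; the master host, jar, and class names are placeholders, not values from this thread:

```shell
# Check the master UI first (default http://<master-host>:8080) to confirm
# that workers are registered and to see how many cores and how much memory
# each one actually offers.
#
# Then submit with explicit, modest resource requests so the job fits on
# the workers that exist. Host, jar, and class names are placeholders.
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode client \
  --executor-memory 1g \
  --total-executor-cores 2 \
  --class com.example.Main \
  app.jar
```

Since the submission here comes from a remote server in client mode, it's also worth checking that the workers can reach the driver machine back over the network (e.g. via `spark.driver.host` and open firewall ports); an unreachable driver produces this same looping warning.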
kiran-jayaram commented
Hi, I am having the same issue. Did it get resolved?