Memory limit issue
jyh6681 opened this issue · comments
On the submission page for the predicted data, the following error occurs: "The container was killed as it exceeded the memory limit of 4g."
Does that mean the CPU memory is not allowed to exceed 4 GB while the Docker container is running?
I only submitted the predicted file, and I think there is no need to run a Docker container, since evaluating the predicted file only computes the metric values. Is that right?
Hi @jyh6681
This is a problem on the grand-challenge side when computing the NSD.
You do not need to submit on grand-challenge. We will get back to you with the results based on your Docker container.
Thanks for your reply. So in the next few days, we only need to send you the Docker file, and there is no need to package and deliver the predicted data file. Is that right?
Another question: every team has 5 submission opportunities. Does that mean we can send 5 Docker files to your mailbox?
-
Yes.
-
The five opportunities were for the validation phase. Now it is the testing phase, and each team has only one chance.
Understood.