Failed with UMFPACK_ERROR_out_of_memory
ycjing opened this issue · comments
Thanks for the great code. When I run the algorithm on my own high-resolution images (655 × 1280), I find that scipy.sparse.linalg.spsolve with scikit-umfpack as the solver requires too much memory (more than 128 GB).
After some investigation, I suspect the problem might be OS dependent. However, I followed the instructions: my OS is Ubuntu 16.04, with the same CUDA and Python versions.
I wonder if anyone else has run into this issue, and whether there is another solver I could use. Thanks.
Try uninstalling scikit-umfpack (with pip). I have found that the default sparse solver can sometimes handle larger images, though it is considerably slower.
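If uninstalling the package is undesirable, SciPy can also be told to skip UMFPACK at runtime via `scipy.sparse.linalg.use_solver`, so `spsolve` falls back to the built-in SuperLU factorization. A minimal sketch; the small tridiagonal system here is only a stand-in for the much larger system built from the image:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve, use_solver

# Force spsolve to use the built-in SuperLU factorization even when
# scikit-umfpack is installed.
use_solver(useUmfpack=False)

# Toy tridiagonal SPD system standing in for the real (much larger) one.
n = 100
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

x = spsolve(A, b)
print(np.allclose(A @ x, b))  # True: SuperLU solved the system directly
```

The switch is global for the process; alternatively, `spsolve(A, b, use_umfpack=False)` disables UMFPACK for a single call.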
@wesleyw72 Thanks! But it is far too slow, especially for high-resolution images :(
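One memory-lighter alternative, assuming the system matrix is symmetric positive definite (Laplacian-style systems from image algorithms usually are), is an iterative solver such as conjugate gradients, which needs only matrix-vector products and no factorization fill-in. A sketch on a toy system; the real matrix would come from the image:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import cg

# Toy SPD tridiagonal system; memory cost per CG iteration is O(nnz),
# with no factorization fill-in, unlike a direct solver.
n = 100
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x, info = cg(A, b)
print(info == 0)  # 0 means the iteration converged
print(np.allclose(A @ x, b, atol=1e-3))
```

Whether this is faster than SuperLU depends on conditioning; a preconditioner (the `M` argument to `cg`) can help for large, ill-conditioned systems.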
@Yijunmaverick Really appreciate the help! Thanks!
Dear @ycjing, I also have this problem. Can you show me how to handle it? Many thanks!
Hi @785256592 ,
Since it has been almost two years since I worked on this project, I hardly remember the problem or the solution. Really sorry about that.
BTW, in my case, simply uninstalling scikit-umfpack worked.
Thank you! Uninstalling scikit-umfpack worked for me too.