When input image is 480 * 960, there is CUDA out of memory error.
wtt5220150 opened this issue
Hello, can you provide the GPU model, VRAM, and specific details of the error message?
We use a V100 with 32 GB of VRAM. When debugging with breakpoints, the CUDA out of memory error occurs at `latents_condition_image = self.vae.encode(image*2-1).latent_dist.sample()`.
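For context, the `image*2-1` in the failing line rescales an image from `[0, 1]` to the `[-1, 1]` range that VAE encoders typically expect; the OOM itself comes from the encoder's activations, whose size grows with the input height and width. A minimal sketch of that rescaling (the array shape here is just illustrative):

```python
import numpy as np

# A batch of one 3-channel 480x960 image in [0, 1].
image = np.random.rand(1, 3, 480, 960).astype(np.float32)

# Map [0, 1] -> [-1, 1], as in `self.vae.encode(image*2-1)`.
scaled = image * 2 - 1
assert scaled.min() >= -1.0 and scaled.max() <= 1.0
```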
Are there any restrictions on input resolution?
Hello, we have updated the vae_encoder tiling function. With the vae_encoder tile size set to 1024, it needs about 13 GB of VRAM.
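The idea behind tiled VAE encoding is to split the image into fixed-size tiles, encode each tile separately, and stitch the latents back together, so peak memory is bounded by the tile size rather than the full resolution. The sketch below is not the repository's actual implementation (real tiled VAEs, e.g. diffusers' `AutoencoderKL.enable_tiling()`, also blend overlapping tile borders); it only illustrates the mechanism with a toy "encoder" that downsamples by 8x average pooling:

```python
import numpy as np

def toy_encode(x, factor=8):
    # Stand-in for a VAE encoder: downsample by `factor` via average pooling.
    h, w = x.shape
    return x.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def tiled_encode(x, tile=256, factor=8):
    # Encode tile by tile so peak memory is bounded by the tile size.
    h, w = x.shape
    out = np.zeros((h // factor, w // factor), dtype=x.dtype)
    for y in range(0, h, tile):
        for x0 in range(0, w, tile):
            patch = x[y:y + tile, x0:x0 + tile]
            out[y // factor:(y + patch.shape[0]) // factor,
                x0 // factor:(x0 + patch.shape[1]) // factor] = toy_encode(patch, factor)
    return out

img = np.random.rand(480, 960).astype(np.float32)
# With a non-overlapping pooling "encoder", tiling is exactly equivalent.
assert np.allclose(tiled_encode(img), toy_encode(img))
```

With a real convolutional encoder, neighboring tiles would need some overlap to avoid seams in the latents, which is why practical implementations blend tile borders.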
Thanks, the problem is solved.
Thank you as well for reporting this case; it helped us improve the code.