Can an NVIDIA GeForce RTX 4090 24G run inference? OutOfMemoryError
missu263 opened this issue · comments
聂琦 commented
Inference raises an error even though I have already set batch_size to 1:
python -m scripts.inference --inference_config configs/inference/test.yaml --batch_size 1
It still fails with:
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 58.00 MiB (GPU 0; 23.64 GiB total capacity; 5.14 GiB already allocated; 38.62 MiB free; 5.24 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
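The traceback itself hints that the failure may be allocator fragmentation rather than a true capacity limit (only ~5 GiB of the 24 GiB is allocated), and suggests tuning `max_split_size_mb`. A minimal sketch of that workaround, assuming the value 128 as an illustrative starting point (not a value from this repo):

```python
import os

# Must be set before torch initializes CUDA (e.g. at the very top of the
# script, or exported in the shell before launching scripts.inference).
# 128 MiB is an illustrative guess; smaller values reduce fragmentation
# at some cost in allocation speed.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

print(os.environ["PYTORCH_CUDA_ALLOC_CONF"])
```

Equivalently, `export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128` in the shell before running the inference command above.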
ak01user commented
Could it be that the video, or the audio used for inference, is too long?