CUDA out of memory. How to deal with GPU memory shortage


This error occurs when Stable Diffusion runs out of GPU memory.

In my case, the following message appeared:

CUDA out of memory. Tried to allocate 3.35 GiB (GPU 0; 8.00 GiB total capacity; 5.63 GiB already allocated; 843.24 MiB free; 5.65 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

Edit webui-user.bat and, if you haven't installed xformers yet, install it.
The error can also occur when using a VAE, so add --no-half-vae as well.

set COMMANDLINE_ARGS=--xformers --no-half-vae
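
The error message also suggests setting max_split_size_mb to avoid fragmentation. As a sketch, this can go in the same webui-user.bat via the PYTORCH_CUDA_ALLOC_CONF environment variable (the value 512 here is just an example, not a tested recommendation):

```bat
rem Limit the size of memory blocks PyTorch's allocator splits, to reduce fragmentation
set PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512
set COMMANDLINE_ARGS=--xformers --no-half-vae
```

If the error persists, try smaller values; too small a value can slow allocation down.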

End.
