Always Getting "Out of VRAM" Error (Unable to allocate)
This error means the GPU's VRAM is exhausted. The minimum VRAM requirement for `large-v3` is 8GB, but even that does not guarantee smooth operation on an 8GB GPU: other software consumes VRAM at the same time, and longer videos require more of it. When you encounter this error, please try the following:
- Use a smaller model, such as `small`, `medium`, or `base`, instead of `large-v3`.
- If you still want to use a large model, choose the "Pre-segment" or "Equal Split" option.
- Lower the settings under Menu Bar -> Tools/Options -> Advanced Options:
  - CUDA Data Type: change `float32` to `int8`; if that causes an error, use `float16` instead.
  - beam_size: change `5` to `1`.
  - best_of: change `5` to `1`.
  - Context: change `true` to `false`.
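The overrides above can be expressed as a small settings patch. The sketch below is illustrative only: the key names are assumptions mirroring the option labels in the dialog, not the application's actual internal configuration keys.

```python
# Low-VRAM overrides mirroring the Advanced Options above.
# Key names are hypothetical; they follow the dialog labels.
LOW_VRAM_OVERRIDES = {
    "cuda_data_type": "int8",  # fall back to "float16" if int8 errors out
    "beam_size": 1,            # default 5; 1 reduces VRAM use
    "best_of": 1,              # default 5
    "context": False,          # the "Context" option, off
}

def apply_low_vram(options: dict) -> dict:
    """Return a copy of the current options with the low-VRAM overrides applied."""
    merged = dict(options)
    merged.update(LOW_VRAM_OVERRIDES)
    return merged
```

For example, `apply_low_vram({"beam_size": 5, "best_of": 5})` yields a settings dict with both values reduced to `1`, leaving any unrelated options untouched.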