Note: The latest model has been released as GGUF. Please download it from Civitai.
Original Model
Quantization Method
- Dequantize the FP8 weights using https://github.com/Kickbub/Dequant-FP8-ComfyUI
- Quantize to GGUF using https://github.com/city96/ComfyUI-GGUF/tree/main/tools (see the sketch below)
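The two steps above form a small conversion pipeline: the FP8 checkpoint is first dequantized to FP16/BF16 safetensors (done inside ComfyUI with the Dequant-FP8-ComfyUI nodes), then converted to an unquantized GGUF and quantized down to the target bit width with the ComfyUI-GGUF tools. The sketch below shows roughly how the GGUF half of that can be driven from Python. The `--src` flag of `convert.py`, the `-F16.gguf` output naming, and the `llama-quantize` binary (built from llama.cpp with the `lcpp.patch` from ComfyUI-GGUF applied) are assumptions based on that repository's tooling, not something this card specifies; all paths are placeholders.

```python
"""Rough sketch of the GGUF conversion/quantization steps (assumed workflow).

Run from the ComfyUI-GGUF/tools directory, with a patched llama-quantize
binary available on PATH. The input safetensors file is assumed to already
be dequantized from FP8 to FP16.
"""
import subprocess
from pathlib import Path

src = Path("models/unet/model-fp16.safetensors")   # dequantized checkpoint (placeholder path)
f16_gguf = src.with_name(src.stem + "-F16.gguf")   # assumed output naming of convert.py
quant = "Q4_K_S"                                   # target GGUF quantization type
out_gguf = src.with_name(f"{src.stem}-{quant}.gguf")

# 1. safetensors -> unquantized GGUF, via convert.py from ComfyUI-GGUF/tools.
subprocess.run(["python", "convert.py", "--src", str(src)], check=True)

# 2. F16 GGUF -> quantized GGUF, via llama-quantize (llama.cpp with lcpp.patch).
subprocess.run(["llama-quantize", str(f16_gguf), str(out_gguf), quant], check=True)

print(f"Wrote {out_gguf}")
```

Repeating step 2 with different quantization types (Q2_K, Q3_K_S, Q5_K_S, Q8_0, etc.) produces the various bit-width files listed below.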
Versions
Check the branches for previous model versions.
Available quantizations: 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit