Qwen2.5-0.5B

An ExLlamaV2 8 bpw (bits per weight) quantization of https://huggingface.co/Qwen/Qwen2.5-0.5B
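A minimal sketch of loading an EXL2 quant like this one with the `exllamav2` library. It assumes the quantized weights have already been downloaded to a local directory (the path below is a placeholder) and that a CUDA GPU is available; generation settings are illustrative, not part of this model card.

```python
# Sketch: load a local EXL2 quant and generate a short completion.
# Assumes: weights downloaded locally, CUDA GPU present, exllamav2 installed.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

config = ExLlamaV2Config("Qwen2.5-0.5B-8bpw-EXL2")  # placeholder local path
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # allocate cache as the model loads
model.load_autosplit(cache)               # split layers across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)

print(generator.generate(prompt="Hello, my name is", max_new_tokens=32))
```

At 8 bpw the quantization is close to lossless for a model this small, so outputs should track the original FP16 checkpoint closely.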

Model tree for altomek/Qwen2.5-0.5B-8bpw-EXL2