Qwen2.5-32B

An ExLlamaV2 8 bpw (bits per weight) quantization of https://huggingface.co/Qwen/Qwen2.5-32B
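
Below is a minimal sketch of loading this EXL2 quant with the exllamav2 library. The download path, prompt, and sampling values are illustrative assumptions, not part of this repo; it follows the standard exllamav2 example pattern and uses huggingface_hub to fetch the weights.

```python
# Minimal usage sketch (assumptions: local GPU(s) with enough VRAM for an 8 bpw 32B model,
# exllamav2 and huggingface_hub installed; prompt and sampling settings are illustrative)
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Download (or reuse a cached copy of) the quantized weights from the Hub
model_dir = snapshot_download(repo_id="altomek/Qwen2.5-32B-8bpw-EXL2")

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # lazy cache lets load_autosplit spread layers across GPUs
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7
settings.top_p = 0.9

print(generator.generate_simple("Qwen2.5 is", settings, 64))
```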

Model tree for altomek/Qwen2.5-32B-8bpw-EXL2
Base model: Qwen/Qwen2.5-32B
This model: an EXL2 quantization of the base model