This is huggingkot/Control-Nanuq-8B-q4f16_1-MLC, a conversion of the Control-Nanuq-8B model to MLC format with q4f16_1 quantization.

The model can be used with the MLC-LLM and WebLLM projects.
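As a usage sketch, the converted weights can be loaded through MLC-LLM's Python API. This assumes `mlc-llm` is installed with a compatible GPU runtime; the `HF://` model path tells the engine to fetch the weights from this Hugging Face repository on first use.

```python
# Hedged sketch: chat with this MLC-converted model via the MLC-LLM Python API.
# Requires the mlc-llm package and a supported GPU backend (e.g. CUDA, Metal, Vulkan).
from mlc_llm import MLCEngine

model = "HF://huggingkot/Control-Nanuq-8B-q4f16_1-MLC"
engine = MLCEngine(model)  # downloads and loads the q4f16_1 weights

# Stream a chat completion token by token.
for chunk in engine.chat.completions.create(
    messages=[{"role": "user", "content": "Hello!"}],
    model=model,
    stream=True,
):
    for choice in chunk.choices:
        print(choice.delta.content, end="", flush=True)

engine.terminate()
```

For in-browser use, WebLLM exposes a similar OpenAI-style chat API in JavaScript, pointed at the same repository.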
