Missing Config Files #5
by ashbo · opened
When I try to serve this model with vLLM, I run into errors.
The README has not been updated for the quantization: all of its commands still reference the name of the unquantized model.
Also, this quantized version of the model does not include any config files. Could you add them?