How to create decoder_model_merged.onnx models
#2 · by lakpriya · opened
I'm using this script to convert my fine-tuned models to ONNX format:
https://huggingface.co/spaces/onnx-community/convert-to-onnx/tree/main
python3 /content/convert.py \
--quantize \
--task text-generation \
--model_id <model>
It creates several quantized models, such as model.onnx, model_int8.onnx, etc., but I see that the Transformers.js demo uses decoder_model_merged.onnx. I want to know how to create that file. Could you please help me with this?