Conversion to ONNX

#14
by imandel - opened

After finetuning according to the suggested Colab notebook, how do I convert the model to ONNX for running with WebGPU?

Liquid AI org

That model works great! I was more curious which scripts are used to convert to ONNX. LFM2 does not seem to be currently supported by Hugging Face Optimum. If I'm starting with the current LiquidAI/LFM2-350M model, or a finetune with a LoRA adapter, I then need to convert it to ONNX myself rather than using the existing model you linked.
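
A side note on the LoRA case: ONNX exporters expect a single set of dense weights, so a LoRA finetune generally has to be merged into the base model first (e.g. with `merge_and_unload()` from `peft`) before any conversion tool is run. A minimal self-contained sketch of what that merge does numerically (the sizes, rank, and scaling here are illustrative, not LFM2's actual dimensions):

```python
import numpy as np

# Conceptual sketch of merging a LoRA adapter: the exporter needs one dense
# weight matrix per layer, so the low-rank update B @ A is folded into W.
rng = np.random.default_rng(0)
d, r = 8, 2                       # hidden size and LoRA rank (illustrative)
W = rng.normal(size=(d, d))       # frozen base weight
A = rng.normal(size=(r, d))       # LoRA down-projection
B = rng.normal(size=(d, r))       # LoRA up-projection
scaling = 16 / r                  # lora_alpha / r

W_merged = W + scaling * (B @ A)  # the "merge": adapter weights are dropped after this

x = rng.normal(size=(d,))
y_adapter = W @ x + scaling * (B @ (A @ x))  # base + adapter paths at inference
y_merged = W_merged @ x                      # single dense matmul after merging
assert np.allclose(y_adapter, y_merged)
```

After merging, the checkpoint is a plain dense model, so whatever export path eventually supports LFM2 should apply to the finetune the same way as to the base model.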

Liquid AI org

@imandel Optimum support is coming soon; the model was merged into ExecuTorch: https://github.com/pytorch/executorch/pull/13805

@ykhrustalev any update on the optimum support?

Liquid AI org

@Dragonriders not quite there yet, but check out this ExecuTorch example: https://github.com/pytorch/executorch/tree/main/examples/models/lfm2

@ykhrustalev any ETA? or progress we can track?

Liquid AI org

@moogin https://huggingface.co/LiquidAI/LFM2.5-1.2B-Instruct-ONNX

LFM2 ONNX versions will be released soon.

Thank you for letting me know, can't wait 💪🏻

Liquid AI org

@moogin @imandel https://github.com/Liquid4All/onnx-export/

Deeply appreciate the reply 😊 love you guys at Liquid, thanks for the amazing models and amazing work!
