"ValueError: Trying to export a codesage model" while trying to export codesage/codesage-large #2080
Labels: bug
System Info
Who can help?
@michaelbenayoun
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction (minimal, reproducible, runnable)
This is a PyTorch embedding model released by AWS, as described here: https://www.linkedin.com/posts/changsha-ma-9ba7a485_yes-code-needs-its-own-embedding-models-activity-7163196644258226176-bFSW
I'm hoping to use it for RAG-based code understanding under Ollama.
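The failing export is, as far as I can tell, equivalent to something like the sketch below (the output path, the task name, and whether trust_remote_code is needed are my guesses, not verified details):

```python
# Minimal sketch of the failing export (paths, task, and flags are assumptions).
from optimum.exporters.onnx import main_export

main_export(
    "codesage/codesage-large",   # the AWS CodeSage embedding model
    output="codesage_onnx",      # hypothetical output directory
    task="feature-extraction",   # embedding models usually map to feature-extraction
    trust_remote_code=True,      # the model ships custom modeling code on the Hub
)
# Raises: ValueError: Trying to export a codesage model, that is a custom or
# unsupported architecture, but no custom onnx configuration was passed ...
```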
The error:

ValueError: Trying to export a codesage model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type codesage to be supported natively in the ONNX export.

I am grateful for any help you can provide!
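For context, my reading of the linked usage guide is that a custom export would look roughly like the sketch below. The CodeSageOnnxConfig class, the input names, and whether last_hidden_state is the right output for this architecture are all assumptions on my part, not a confirmed working recipe:

```python
# Rough sketch of a custom ONNX export for codesage, following the linked
# usage guide. The config class, input names, and trust_remote_code flag are
# assumptions about this architecture, not a verified recipe.
from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.config import TextEncoderOnnxConfig
from optimum.utils import DummyTextInputGenerator, NormalizedTextConfig
from transformers import AutoConfig


class CodeSageOnnxConfig(TextEncoderOnnxConfig):
    # Assumes codesage exposes standard hidden_size / num_attention_heads /
    # num_hidden_layers attributes; otherwise NormalizedTextConfig needs remapping.
    NORMALIZED_CONFIG_CLASS = NormalizedTextConfig
    DUMMY_INPUT_GENERATOR_CLASSES = (DummyTextInputGenerator,)

    @property
    def inputs(self):
        # Assumed BERT-style encoder inputs with dynamic batch/sequence axes.
        return {
            "input_ids": {0: "batch_size", 1: "sequence_length"},
            "attention_mask": {0: "batch_size", 1: "sequence_length"},
        }


model_id = "codesage/codesage-large"
config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)

main_export(
    model_id,
    output="codesage_onnx",
    task="feature-extraction",
    trust_remote_code=True,
    # "model" is the key used for single-component exports in the usage guide.
    custom_onnx_configs={"model": CodeSageOnnxConfig(config, task="feature-extraction")},
)
```

If this is roughly the right direction, I'd still prefer native support for the codesage model type so the plain optimum-cli export works out of the box.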
Expected behavior
An exported ONNX file.