This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Onnx weights for embedding models #770

Open
1951FDG opened this issue Jan 11, 2024 · 0 comments

Comments


1951FDG commented Jan 11, 2024

Hello,

I hope this is the right place to ask. It would be nice to have ONNX weights added for the models listed below, just as was done for:
https://huggingface.co/jinaai/jina-embeddings-v2-base-en/tree/main

Models:
- jina-embedding-t-en-v1
- jina-embedding-s-en-v1
- jina-embedding-b-en-v1
- jina-embedding-l-en-v1

Ideally the weights would be provided both quantized and unquantized, placed in a folder named onnx, just like:
https://huggingface.co/Xenova/jina-embeddings-v2-base-en

onnx
- model.onnx
- model_quantized.onnx

I will try converting one myself (jina-embedding-s-en-v1) soon to see whether it works, but it would be nice to have official weights to compare results against.
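
For reference, a minimal sketch of how such a conversion might look with Hugging Face Optimum, assuming the v1 architectures are supported by its ONNX exporter. The model id, output directory, and quantization config below are just example choices, and API details may differ between Optimum versions:

```python
# Sketch (not official): export a Jina v1 embedding model to ONNX with
# Hugging Face Optimum and write a quantized copy next to it.
# Assumes the model architecture is supported by Optimum's ONNX exporter.
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForFeatureExtraction, ORTQuantizer
from optimum.onnxruntime.configuration import AutoQuantizationConfig

model_id = "jinaai/jina-embedding-s-en-v1"  # example model id
save_dir = "jina-embedding-s-en-v1/onnx"    # target layout from this issue

# Export the PyTorch checkpoint to ONNX (writes model.onnx plus config/tokenizer files).
model = ORTModelForFeatureExtraction.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model.save_pretrained(save_dir)
tokenizer.save_pretrained(save_dir)

# Dynamic int8 quantization; the default file suffix yields model_quantized.onnx.
quantizer = ORTQuantizer.from_pretrained(model)
qconfig = AutoQuantizationConfig.avx512_vnni(is_static=False, per_channel=False)
quantizer.quantize(save_dir=save_dir, quantization_config=qconfig)
```

Embeddings produced from the exported model.onnx could then be compared against the original PyTorch checkpoint, which is exactly why official reference weights would be useful.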
