[Doc] ML_inference processor X RerankByField tutorial needed #3175
Closed
brianf-aws opened this issue Oct 28, 2024 · 3 comments · Fixed by opensearch-project/documentation-website#8694
Comments
Can I assign this issue to you?
yep!
brianf-aws changed the title from [Doc] Ml_inference processor X RerankByField tutorial needed to [Doc] ML_inference processor X RerankByField tutorial needed on Oct 28, 2024
Closing this issue as the doc PR has been merged.
github-project-automation bot moved this from In Progress to Done in ml-commons projects on Dec 5, 2024
Neural Search 2.18 introduces the rerank-by-field search response processor.
We need a tutorial that shows how the ml_inference search response processor can be used with the new rerank type so that documents can be ranked according to any external model. This is not to be confused with the existing ml_opensearch rerank type, which reranks using a cross-encoder model only.
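As a rough sketch of the idea (not the final tutorial content), a search pipeline could chain the two processors: ml_inference writes a model-produced score into each hit, and rerank by_field sorts hits by that score. The model ID, field names, and the model output path `$.score` below are placeholder assumptions, not values from this issue:

```json
PUT /_search/pipeline/ml_inference_rerank_by_field
{
  "response_processors": [
    {
      "ml_inference": {
        "tag": "ml_inference",
        "description": "Score each returned document with an external model (model ID is a placeholder)",
        "model_id": "<external_model_id>",
        "function_name": "REMOTE",
        "input_map": [
          {
            "text": "passage_text"
          }
        ],
        "output_map": [
          {
            "rank_score": "$.score"
          }
        ],
        "ignore_missing": false,
        "ignore_failure": false,
        "one_to_one": true
      }
    },
    {
      "rerank": {
        "by_field": {
          "target_field": "rank_score",
          "remove_target_field": true
        }
      }
    }
  ]
}
```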
This work needs to be done in the OpenSearch documentation-website repo.
Here is a general workflow I can think of to describe this solution, although I would like more discussion on how to present it.
I still have to think of an interesting scenario that a user may want to see.
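Whatever scenario the tutorial settles on, the query side stays simple: the request just references the pipeline by name. The index and field names here are hypothetical:

```json
GET /my-index/_search?search_pipeline=ml_inference_rerank_by_field
{
  "query": {
    "match": {
      "passage_text": "example user question"
    }
  }
}
```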