[BUG] mapper_parsing_exception when ingesting a nested knn vector from a remote model #2995
Tested, this should work. You don't need to configure
The model output is
If you use this in the output mapping
That's not the expected input, so we should use
Then remove the
You don't need to configure this painless processor if you want to keep the
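(Since the original snippets in this comment were not captured, here is an illustrative ml_inference ingest pipeline of the kind being discussed, mapping the remote model's output directly onto the nested field with no extra painless processor. The field names, the `input` key, and the JSON paths are assumptions, not the exact configuration from this thread.)

```json
PUT /_ingest/pipeline/nested_embedding_pipeline
{
  "description": "Embed each nested chunk with the remote model",
  "processors": [
    {
      "ml_inference": {
        "model_id": "<model_id>",
        "input_map": [
          {
            "input": "chunks.*.text"
          }
        ],
        "output_map": [
          {
            "chunks.*.embedding": "$.inference_results.*.output.*.data"
          }
        ]
      }
    }
  ]
}
```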
@ylwu-amzn Thanks, this worked! I will be updating the ml_inference OS docs, as it's a bit confusing what the ml_inference processor expects as input.
Thanks @IanMenendez, can you share the OS doc change issue and PR link here? In case someone else has a similar issue, they can refer to your OS doc issue and PR.
What is the bug?
I am trying to ingest nested KNN vectors into my index.
To do this, I use an ml_inference processor with an API of my own, connected as a remote ML model.
But the workflow fails with a mapper_parsing_exception when indexing the document.
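For context, the index mapping involved is of roughly this shape (the field names `chunks` and `embedding` and the dimension are illustrative, since the original mapping was not captured in this thread):

```json
PUT /my-knn-index
{
  "settings": {
    "index.knn": true
  },
  "mappings": {
    "properties": {
      "chunks": {
        "type": "nested",
        "properties": {
          "text": { "type": "text" },
          "embedding": {
            "type": "knn_vector",
            "dimension": 384
          }
        }
      }
    }
  }
}
```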
How can one reproduce the bug?
3. Register and deploy the remote ML model
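Registering and deploying a remote model generally follows this shape (the model name is a placeholder; `<connector_id>` and `<model_id>` come from the connector-creation and register calls):

```json
POST /_plugins/_ml/models/_register
{
  "name": "my-remote-embedding-model",
  "function_name": "remote",
  "description": "Remote embedding model behind a custom API",
  "connector_id": "<connector_id>"
}

POST /_plugins/_ml/models/<model_id>/_deploy
```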
Additional information
I even tried using a post_process_function with the connector, but it failed with the same exception.
The post_process_function I tried:
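(The snippet originally attached here was not captured. For reference, a connector post_process_function is a Painless script, supplied as a JSON-escaped string in the connector's predict action, that reshapes the remote API's raw response into the tensor format ml-commons expects. The sketch below is generic; `params.embedding` is an assumption about the response field name.)

```painless
// Generic sketch of a post_process_function for an embedding connector.
// "params" holds the parsed JSON response from the remote API;
// params.embedding is an assumed name for the returned vector field.
def name = "sentence_embedding";
def dataType = "FLOAT32";
if (params.embedding == null || params.embedding.length == 0) {
    return null;
}
def shape = [params.embedding.length];
// Build the tensor JSON that ml-commons expects as model output.
return "{" +
    "\"name\":\"" + name + "\"," +
    "\"data_type\":\"" + dataType + "\"," +
    "\"shape\":" + shape + "," +
    "\"data\":" + params.embedding +
    "}";
```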
What is the expected behavior?
Ingestion should not fail, and the document should be indexed.
What is your host/environment?