[BUG] Invalid JSON in payload error despite sending a valid JSON with all required parameters #1872

Closed
NeuralFlux opened this issue Jan 15, 2024 · 5 comments

@NeuralFlux

What is the bug?
Sending a predict request to a model that uses a SageMaker connector, like so:

POST /_plugins/_ml/models/_6gdD40BZqSAbrEiV6DT/_predict
{
  "parameters": {
    "inputs": "test sentence"
  }
}

produces an error

{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "Invalid JSON in payload"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "Invalid JSON in payload"
  },
  "status": 400
}

despite the payload being complete, valid JSON.

How can one reproduce the bug?
Steps to reproduce the behavior:

  1. Set up a connector to a SageMaker endpoint (any endpoint is fine, since the error occurs in the connector itself)
  2. Deploy the model and note the model ID
  3. Send a predict request:

     curl -XPOST "http://localhost:9200/_plugins/_ml/models/<Model ID>/_predict" -H 'Content-Type: application/json' -d '{ "parameters": { "inputs": "test sentence" } }'
  4. See error

What is the expected behavior?
The request should trigger processing of request_body in the connector without any errors.

What is your host/environment?

  • AWS OpenSearch Service 2.11
  • ML Commons

Do you have any additional context?
Passing "inputs": ["test sentence"] works. However, I need the embedding of just the sentence, without the extra square brackets. Moreover, ${parameters.input}[0] runs without any errors, but it gave a different embedding for a test sentence than expected.

@NeuralFlux NeuralFlux added bug Something isn't working untriaged labels Jan 15, 2024
@ramda1234786

Hi @NeuralFlux, it works well for me. There may be a problem in the request_body used when creating the connector.

It should be

"request_body": "{ \"inputs\": \"${parameters.inputs}\" }"
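As a quick sanity check, here is a sketch in Python of why this template produces a parseable payload. It assumes ml-commons fills in ${parameters.inputs} as a plain string substitution, which is a simplification for illustration:

```python
import json

# The request_body template suggested above.
template = '{ "inputs": "${parameters.inputs}" }'

# Assumption: the connector substitutes the parameter as plain text.
body = template.replace("${parameters.inputs}", "test sentence")

# Because the placeholder sits inside double quotes in the template,
# the substituted body is a valid JSON object.
payload = json.loads(body)
print(payload)  # -> {'inputs': 'test sentence'}
```

The quotes around the placeholder in the template are what make the difference: they turn the substituted text into a JSON string literal.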



NeuralFlux commented Jan 16, 2024

I realized there are different standards for what counts as "valid" JSON. May I know which standard is used to check the payload? A plain string is a valid JSON document according to RFC 7159 and RFC 8259.
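For illustration, Python's json module (an RFC 8259 parser; ml-commons may use a different one) shows the relevant distinction: a top-level string is a valid JSON document, but only when it is double-quoted. A bare, unquoted token, which is what an unquoted "${parameters.inputs}" template substitutes to, is not valid under either RFC:

```python
import json

# RFC 8259 allows any JSON value, including a string,
# as a top-level document -- but it must be quoted.
assert json.loads('"test sentence"') == "test sentence"

# A bare, unquoted token is not a JSON string literal,
# so parsing it fails.
try:
    json.loads("test sentence")
except json.JSONDecodeError:
    print("invalid JSON")  # this branch is taken
```

So the question may be less about which standard is applied and more about whether the substituted payload ends up quoted at all.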


b4sjoo commented Jan 19, 2024

Hi @NeuralFlux, could you please share your connector configuration?

@b4sjoo b4sjoo assigned b4sjoo and Zhangxunmt and unassigned b4sjoo Jan 19, 2024
@b4sjoo b4sjoo removed the untriaged label Jan 19, 2024
@NeuralFlux

Sure thing, it's:

"actions": [
      {
         "action_type": "predict",
         "method": "POST",
         "headers": {
            "content-type": "application/x-text"
         },
         "url": "<INFERENCE_ENDPOINT>",
         "request_body": "${parameters.inputs}"
      }
   ]

@b4sjoo b4sjoo moved this from Untriaged to On-deck in ml-commons projects Feb 13, 2024

mingshl commented Jul 2, 2024

This might be resolved using the model interface (#2354).

@mingshl mingshl closed this as completed Jul 2, 2024
@github-project-automation github-project-automation bot moved this from On-deck to Done in ml-commons projects Jul 2, 2024