
[BUG] Inconsistency with generated model interfaces #2942

Closed
ohltyler opened this issue Sep 13, 2024 · 1 comment · Fixed by #2962
Labels: bug (Something isn't working)


ohltyler (Member) commented Sep 13, 2024

I've run into inconsistencies in the documentation and functionality of the model interfaces.

The documentation states:

To simplify your workflow, you can register an externally hosted model using a connector in one of the connector blueprint formats. If you do so, a predefined model interface for this connector is generated automatically during model registration. The predefined model interface is generated based on the connector blueprint and the model's metadata, so you must strictly follow the blueprint when creating the connector in order to avoid errors.

First, what does it mean to "strictly follow the blueprint"? What can be changed or customized in a blueprint while still triggering automatic interface generation?

Second, is there a library of suggested or preset model interfaces?

An example of the ambiguity and issues I'm facing: the second connector example here has a request body containing parameters.prompt but no parameters.inputs anywhere. Yet the default interface generated for it requires inputs but not prompt.

Interface generated:

"interface": {
    "output": """{
    "type": "object",
    "properties": {
        "inference_results": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "output": {
                        "type": "array",
                        "items": {
                            "type": "object",
                            "properties": {
                                "name": {
                                    "type": "string"
                                },
                                "dataAsMap": {
                                    "type": "object",
                                    "properties": {
                                        "response": {
                                            "type": "string"
                                        }
                                    },
                                    "required": [
                                        "response"
                                    ]
                                }
                            },
                            "required": [
                                "name",
                                "dataAsMap"
                            ]
                        }
                    },
                    "status_code": {
                        "type": "integer"
                    }
                },
                "required": [
                    "output",
                    "status_code"
                ]
            }
        }
    },
    "required": [
        "inference_results"
    ]
}""",
    "input": """{
    "type": "object",
    "properties": {
        "parameters": {
            "type": "object",
            "properties": {
                "inputs": {
                    "type": "string"
                }
            },
            "required": [
                "inputs"
            ]
        }
    },
    "required": [
        "parameters"
    ]
}"""
  }
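To illustrate the mismatch, the generated input schema above accepts a payload that supplies only parameters.inputs. Below is a minimal sketch that checks a payload against that schema's required fields; it is a naive stand-in for full JSON Schema validation, not the actual ml-commons validator:

```python
# The generated "input" schema from the interface above.
INPUT_SCHEMA = {
    "type": "object",
    "properties": {
        "parameters": {
            "type": "object",
            "properties": {"inputs": {"type": "string"}},
            "required": ["inputs"],
        }
    },
    "required": ["parameters"],
}

def missing_required(schema, data, path=""):
    """Recursively collect required properties absent from `data`.

    A simplified required-field check, sufficient to show which
    fields the generated interface insists on.
    """
    missing = []
    if schema.get("type") == "object" and isinstance(data, dict):
        for key in schema.get("required", []):
            if key not in data:
                missing.append(f"{path}{key}")
        for key, subschema in schema.get("properties", {}).items():
            if key in data:
                missing += missing_required(subschema, data[key], f"{path}{key}.")
    return missing

# A payload with only `inputs` satisfies the generated interface...
payload = {"parameters": {"inputs": "hello"}}
print(missing_required(INPUT_SCHEMA, payload))  # []

# ...even though the blueprint's request body actually needs `prompt`.
```

So the interface check passes, and the request only fails later when the connector tries to build the actual request body.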

Error when executing predict (unsurprising, since prompt is required in the request body but is not listed under the default parameters in the connector blueprint):

{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "Some parameter placeholder not filled in payload: prompt"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "Some parameter placeholder not filled in payload: prompt"
  },
  "status": 400
}

The automatically generated interfaces should be consistent with, and derived from, the connector blueprint's required parameters, so that inputs satisfying the model interface don't then fail when hitting the actual prediction.
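The failure mode above can be sketched as follows: before sending the request, ${parameters.NAME} placeholders in the connector's request body are substituted from the supplied parameters, and any unfilled placeholder produces the error shown. This is a hypothetical re-implementation of that substitution step (the placeholder syntax matches the blueprints; the function name is invented):

```python
import re

PLACEHOLDER = re.compile(r"\$\{parameters\.(\w+)\}")

def fill_placeholders(request_body: str, parameters: dict) -> str:
    """Substitute ${parameters.NAME} placeholders into the request body,
    raising on any left unfilled (mirroring the
    'Some parameter placeholder not filled in payload' error)."""
    unfilled = [name for name in PLACEHOLDER.findall(request_body)
                if name not in parameters]
    if unfilled:
        raise ValueError(
            "Some parameter placeholder not filled in payload: "
            + ", ".join(unfilled)
        )
    return PLACEHOLDER.sub(lambda m: str(parameters[m.group(1)]), request_body)

# The blueprint's request body uses `prompt`, but the generated
# interface only asked for `inputs`:
body = '{"prompt": "${parameters.prompt}"}'
try:
    fill_placeholders(body, {"inputs": "hello"})
except ValueError as e:
    print(e)  # Some parameter placeholder not filled in payload: prompt
```

The fix should make the interface generator read the same placeholder set the substitution step enforces, so both stages agree on which parameters are required.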

@ohltyler ohltyler added the bug Something isn't working label Sep 13, 2024
@b4sjoo b4sjoo removed the untriaged label Sep 16, 2024
b4sjoo (Collaborator) commented Sep 16, 2024

Nice catch, we will fix the blueprint accordingly.
