
Update LLM frontend docs to cover unconfigured status #363

Open
jtheory opened this issue May 24, 2024 · 1 comment
Labels
documentation Improvements or additions to documentation good first issue Good for newcomers

Comments


jtheory commented May 24, 2024

Our readme here doesn't yet cover how to handle the different LLM app config options, including the new "unconfigured" status added in #350.

The general UX approach we'd like to model is:

  • activate LLM features if they're enabled (either Grafana-provided or other config)
  • hide LLM features from the user if they are disabled or opted out
  • prompt the user that an admin can opt in to LLM-powered features in the LLM plugin if the status is unconfigured
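The three-way branch above can be sketched as follows. This is a minimal illustration, not code from the readme; the `LlmStatus` type and return values are hypothetical names chosen for the example:

```typescript
// Hypothetical status values a frontend might derive from the LLM plugin's
// health check (see the example responses in the comments below).
type LlmStatus = 'enabled' | 'disabled' | 'unconfigured';

// Decide what the frontend should render for each status.
function llmFeatureMode(status: LlmStatus): 'show' | 'hide' | 'prompt-admin' {
  switch (status) {
    case 'enabled':
      // LLM features are enabled: activate them.
      return 'show';
    case 'disabled':
      // Disabled / opted out: hide LLM features from the user entirely.
      return 'hide';
    case 'unconfigured':
      // Prompt the user that an admin can opt in via the LLM plugin.
      return 'prompt-admin';
  }
}
```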

Once our readme has a trivial example, I can start nudging different squads to use this approach (plus get some UX input on how they should do it).

@sd2k sd2k added documentation Improvements or additions to documentation good first issue Good for newcomers labels May 30, 2024

sd2k commented Jun 4, 2024

Some example responses that may be useful here:

Plugin disabled:

{
  "message": "Plugin not found"
}

Plugin enabled but not configured:

{
  "details": {
    "openAI": {
      "configured": false,
      "error": "No functioning models are available",
      "models": {
        "base": {
          "error": "OpenAI not configured",
          "ok": false
        },
        "large": {
          "error": "OpenAI not configured",
          "ok": false
        }
      },
      "ok": false
    },
    "vector": {
      "enabled": false,
      "ok": false
    },
    "version": "0.10.4"
  },
  "message": "",
  "status": "OK"
}

Plugin enabled but "Disable all LLM features in Grafana" chosen (i.e. opted-out):

{
  "details": {
    "openAI": {
      "configured": true,
      "error": "LLM functionality is disabled",
      "models": null,
      "ok": false
    },
    "vector": {
      "enabled": false,
      "ok": false
    },
    "version": "0.10.4"
  },
  "message": "",
  "status": "OK"
}

The @grafana/llm package has two relevant functions:

  • health, which more or less returns the openAI object in the above response
  • enabled, which I think most people are using, which calls health and checks that configured and ok are both true.

It sounds like we want to encourage people to use health and use the value of configured to prompt the user to opt in?
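As a hedged sketch, the mapping from a health-style response to the three UX states might look like this. The response shape follows the example payloads above; the helper name is illustrative and not part of @grafana/llm:

```typescript
// Shape of the openAI details in the example health responses above.
interface OpenAIHealthDetails {
  configured: boolean;
  ok: boolean;
  error?: string;
}

// Illustrative helper (not an @grafana/llm API): map health details to the
// three UX states discussed in this issue.
function statusFromHealth(
  health: OpenAIHealthDetails
): 'enabled' | 'disabled' | 'unconfigured' {
  if (health.configured && health.ok) {
    // Both flags true: roughly what the `enabled` function checks.
    return 'enabled';
  }
  if (!health.configured) {
    // Not configured yet: prompt that an admin can opt in via the LLM plugin.
    return 'unconfigured';
  }
  // Configured but not ok: "Disable all LLM features" chosen, so hide features.
  return 'disabled';
}
```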
