
Empty and Nondeterministic Responses from LLM #50

Open
kshitij79 opened this issue Aug 14, 2024 · 0 comments

Comments

@kshitij79
Contributor

Bug Report 🐛

Users frequently receive empty or nondeterministic responses from the LLM. Similar inputs produce inconsistent, unpredictable output, which undermines the reliability of the results. To improve response quality, the temperature and other sampling hyperparameters should be tuned to make the output more deterministic.

Expected Behavior

The LLM should return consistent, predictable output: responses should never be empty, and similar inputs should yield similar results.

Current Behavior

Responses are sometimes empty, and repeated queries with similar inputs produce noticeably different outputs (see Steps to Reproduce).

Possible Solution

Optimize the temperature and other sampling hyperparameters to make the LLM output more deterministic.
Explore the providers' API documentation for supported parameters (e.g. temperature, seed).
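
As a sketch of that direction, here is a minimal example of pinning the sampling parameters and retrying empty responses. The `query_llm` stub below is hypothetical — it only simulates a provider whose API accepts `temperature` and `seed` parameters, which the real prompt provider may or may not expose:

```python
import random

# Hypothetical stand-in for the real prompt-provider call; it only
# simulates temperature-driven variance and occasional empty responses.
def query_llm(prompt, temperature=1.0, seed=None):
    rng = random.Random(seed)  # a fixed seed makes the simulation repeatable
    if temperature > 0 and rng.random() < 0.05:
        return ""  # simulate an occasional empty response
    return f"answer-{rng.randint(0, int(temperature * 10))}"

def query_deterministic(prompt, retries=3):
    """Pin temperature to 0 and fix the seed, retrying empty responses."""
    for _ in range(retries):
        out = query_llm(prompt, temperature=0.0, seed=42)
        if out.strip():
            return out
    raise RuntimeError("LLM returned only empty responses after retries")
```

The key point is that determinism comes from two knobs together: temperature 0 removes sampling variance, and a fixed seed (where the provider supports one) pins any remaining randomness; the retry loop papers over transient empty responses rather than fixing their cause.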

Steps to Reproduce

  1. Query the LLM through the prompt provider.
  2. Observe that some responses are empty.
  3. Note the variability in responses for similar inputs.
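
The steps above can be sketched as a small harness that issues the same prompt repeatedly and tallies empty and distinct responses. The `query_fn` callable stands in for whatever the prompt provider exposes; the names here are illustrative:

```python
def measure_variability(query_fn, prompt, n=20):
    """Send the same prompt n times; report (empty, distinct) response counts."""
    responses = [query_fn(prompt) for _ in range(n)]
    empties = sum(1 for r in responses if not r.strip())
    distinct = len(set(responses))
    return empties, distinct

# With a fully deterministic provider, expect empties == 0 and distinct == 1;
# any higher counts quantify the bug described above.
```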

Context (Environment)

Application

  • VSCode (All versions)

Detailed Description

Possible Implementation
