Bug Report 🐛
Users frequently receive empty or nondeterministic responses from the LLM. The responses are inconsistent and unpredictable for similar inputs, which hurts the reliability of the output. To improve response quality, the temperature and other sampling hyperparameters should be tuned to make the output more deterministic.
Expected Behavior
The LLM should provide consistent and predictable output. Responses should not be empty, and the results should be more deterministic.
Current Behavior
Responses are sometimes empty, and the output varies unpredictably across runs with the same or similar prompts.
Possible Solution
Tune the temperature and other sampling hyperparameters to make the LLM output more deterministic.
Check each provider's API documentation for the relevant parameters.
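As a sketch of what tuning could look like, assuming an OpenAI-style chat API where the provider call accepts `temperature`, `top_p`, and `seed` keyword arguments (the extension's actual provider wrapper may differ), combined with a simple retry for empty responses:

```python
# Sketch: deterministic-leaning sampling parameters plus a retry for
# empty responses. `call` stands in for whichever provider API the
# extension actually uses; its name and signature are assumptions.

DETERMINISTIC_PARAMS = {
    "temperature": 0.0,  # greedy-like decoding: lowest output variance
    "top_p": 1.0,        # leave nucleus sampling effectively off
    "seed": 42,          # best-effort determinism on providers that support it
}

def query_with_retry(call, prompt, max_retries=3):
    """Call the provider, retrying when the response comes back empty."""
    for _ in range(max_retries):
        text = call(prompt, **DETERMINISTIC_PARAMS)
        if text and text.strip():
            return text
    raise RuntimeError(f"empty response after {max_retries} attempts")
```

Note that even with `temperature=0` and a fixed `seed`, many providers only guarantee best-effort determinism, so the retry on empty responses is still worth keeping.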
Steps to Reproduce
Query the LLM through the prompt provider.
Observe that some responses are empty.
Note the variability in responses for similar inputs.
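The steps above can be sketched as a small harness, assuming a callable `query` wrapping the prompt provider (a hypothetical name for illustration):

```python
def measure_variability(query, prompt, runs=10):
    """Send the same prompt repeatedly and summarize the responses.

    Returns (empty_count, distinct_count): a reliable, deterministic
    provider should yield 0 empty responses and 1 distinct response.
    """
    responses = [query(prompt) for _ in range(runs)]
    empty = sum(1 for r in responses if not r or not r.strip())
    distinct = len(set(responses))
    return empty, distinct
```

Any result with `empty_count > 0` or `distinct_count > 1` reproduces the reported behavior.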
Context (Environment)
Application
VSCode (All versions)
Detailed Description
Possible Implementation