
Fixing LLMEvent returns and streaming logic #181

Merged — 17 commits merged into main from eng-332-fix-returns on May 3, 2024
Conversation

HowieG
Contributor

@HowieG HowieG commented May 3, 2024

📥 Pull Request

📘 Description
Fixes ENG-332
Previously we were not returning the response for LLMEvents, which we need for troubleshooting. Our streaming logic was also incomplete.
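
The two fixes can be sketched roughly as follows. This is a minimal illustration, not the actual AgentOps API: `LLMEventRecord`, `handle_completion`, and `handle_stream` are hypothetical names standing in for the real handler code, and the chunk format is a simplified stand-in for a provider's streaming deltas.

```python
# Hypothetical sketch of the two fixes described in this PR:
#   1. the handler should RETURN the provider response (not drop it),
#   2. streamed responses should be accumulated chunk-by-chunk before
#      the event is finalized.
# All names here are illustrative, not the real AgentOps API.
from dataclasses import dataclass


@dataclass
class LLMEventRecord:          # stand-in for an LLMEvent
    prompt: str
    completion: str = ""
    returns: object = None     # previously left unset; now holds the response


def handle_completion(prompt, response):
    """Non-streaming case: record the event AND return the response."""
    event = LLMEventRecord(
        prompt=prompt,
        completion=response["text"],
        returns=response,      # the previously-missing return payload
    )
    return response, event     # caller gets the response back for debugging


def handle_stream(prompt, chunks):
    """Streaming case: accumulate deltas, then finalize a single event."""
    event = LLMEventRecord(prompt=prompt)
    passed_through = []
    for chunk in chunks:                       # each chunk carries a delta
        event.completion += chunk.get("delta", "")
        passed_through.append(chunk)           # chunks still reach the caller
    event.returns = {"text": event.completion}  # finalize once the stream ends
    return passed_through, event
```

In this shape, the non-streaming path hands the response back unchanged while capturing it on the event, and the streaming path only finalizes the event after the last chunk arrives.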

🧪 Testing
_test_handler_openai_v0.py
_test_handler_openai_v1.py

@HowieG HowieG merged commit 562a1ac into main May 3, 2024
2 checks passed
@HowieG HowieG deleted the eng-332-fix-returns branch May 3, 2024 18:57