
TextLLM: Add token usage to response meta #338

Merged: 1 commit into develop from feature/add-usage-to-meta, Sep 27, 2024

Conversation

@FelixTJDietrich (Collaborator) commented on Sep 4, 2024

Motivation and Context

We want to track token usage on the LMS side.

Description

Adds token usage to the response meta: a total_usage entry plus per-call usage entries under llm_calls, each together with the (OpenAI) model name.
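
For context, a minimal sketch of what such a meta structure could look like. Only the total_usage and llm_calls names and the "(OpenAI) model name" come from this PR; the Pydantic models, field names, and the add_call helper below are hypothetical illustrations, not the actual module code.

```python
# Hypothetical sketch of a response meta carrying token usage.
# total_usage / llm_calls follow the PR description; the token fields
# mirror the usage object the OpenAI API returns with each completion.
from pydantic import BaseModel, Field


class TokenUsage(BaseModel):
    prompt_tokens: int = 0
    completion_tokens: int = 0
    total_tokens: int = 0


class LLMCallUsage(BaseModel):
    model: str          # (OpenAI) model name for this call
    usage: TokenUsage   # tokens consumed by this individual call


class ResponseMeta(BaseModel):
    total_usage: TokenUsage = Field(default_factory=TokenUsage)
    llm_calls: list[LLMCallUsage] = Field(default_factory=list)

    def add_call(self, model: str, usage: TokenUsage) -> None:
        """Record one LLM call and fold its usage into the total."""
        self.llm_calls.append(LLMCallUsage(model=model, usage=usage))
        self.total_usage.prompt_tokens += usage.prompt_tokens
        self.total_usage.completion_tokens += usage.completion_tokens
        self.total_usage.total_tokens += usage.total_tokens
```

In practice, the per-call numbers would be taken from the usage object the OpenAI API returns with each completion, so the LMS can read both the aggregate and the per-call breakdown from the meta.
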

Steps for Testing

Request feedback generation in the playground using the text LLM module.

Screenshots

(Screenshot from 2024-09-04 showing the response meta with token usage.)

@dmytropolityka (Contributor) left a comment:

looks good

@maximiliansoelch changed the title from "Add token usage to response meta" to "TextLLM: Add token usage to response meta" on Sep 27, 2024
@maximiliansoelch merged commit 5c2eae4 into develop on Sep 27, 2024 (14 checks passed)
@maximiliansoelch deleted the feature/add-usage-to-meta branch on September 27, 2024 at 11:18