
add support for TokenUsage through callbacks #137

Merged: 7 commits into main from me-usage-token-callbacks on Jun 12, 2024

Conversation

brainlid
Owner

  • upgrades to Req v0.5.0 with new error returns
  • added LangChain.TokenUsage for tracking input and output tokens for a chat completion
  • added new callback :on_llm_token_usage to notify when token information is received
  • support added in ChatOpenAI, ChatAnthropic, and ChatBumblebee

Token usage information is returned in different ways by each model. The biggest challenge is how it is returned with streaming deltas; more extensive changes were needed to make that work well.
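For context, here is a minimal sketch of how the new callback might be wired up. The handler map shape, the callback arity, and the `callbacks: [handler]` registration are assumptions based on this PR's description rather than the released API; the PR only states that an `:on_llm_token_usage` callback fires with input and output token counts carried by `LangChain.TokenUsage`.

```elixir
alias LangChain.ChatModels.ChatOpenAI
alias LangChain.TokenUsage

# Hypothetical handler map: assumes the callback receives the model and a
# %TokenUsage{} struct exposing input and output token counts.
handler = %{
  on_llm_token_usage: fn _model, %TokenUsage{input: input, output: output} ->
    IO.puts("prompt tokens: #{input}, completion tokens: #{output}")
  end
}

# Assumed registration: callbacks attached to the chat model at creation time.
model = ChatOpenAI.new!(%{model: "gpt-4o", callbacks: [handler]})
```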

- added OpenAI.stream_options support (see the sketch after this list)
- support TokenUsage on streamed and non-streamed responses
- support skipping a parsed message
- Anthropic updates for the updated Req version
- fire the token usage callback
- for new callbacks
- added support for TokenUsage
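The stream_options commit relies on an OpenAI API option: when streaming, setting `stream_options.include_usage` makes the API send one final chunk (with an empty choices list) that carries usage for the whole request. A rough sketch of the request body shape follows; the keys mirror the OpenAI REST parameters rather than necessarily ChatOpenAI's internal field names, and the model name is only a placeholder.

```elixir
# Sketch of an OpenAI streaming request with usage reporting enabled.
# stream_options.include_usage asks the API to emit one extra chunk,
# with an empty "choices" list, carrying token usage for the entire request.
body = %{
  model: "gpt-4o",
  stream: true,
  stream_options: %{include_usage: true},
  messages: [%{role: "user", content: "Hello!"}]
}
```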
@brainlid brainlid merged commit 9fc74c9 into main Jun 12, 2024
1 check failed
@brainlid brainlid deleted the me-usage-token-callbacks branch June 12, 2024 04:11