Add support for API Key Per Request #66
Conversation
@BrianBorge thanks!

```ruby
chat = RubyLLM.chat(model: 'claude-3-7-sonnet', api_key: 'my_api_key')
```

What do you think about including that?
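A minimal sketch of how such a per-call override could behave, using a stand-in module (`PerCallSketch` and its internals are hypothetical, not RubyLLM's actual code): the `api_key:` keyword simply defaults to the globally configured key and wins when supplied.

```ruby
# Hypothetical stand-in for the proposed behavior: api_key defaults to the
# globally configured key and can be overridden per call.
module PerCallSketch
  GLOBAL_KEY = 'global_key'.freeze

  def self.chat(model:, api_key: GLOBAL_KEY)
    # A real implementation would build a provider client here; we just
    # return the resolved settings so the override is visible.
    { model: model, api_key: api_key }
  end
end

PerCallSketch.chat(model: 'claude-3-7-sonnet')
# => { model: 'claude-3-7-sonnet', api_key: 'global_key' }
PerCallSketch.chat(model: 'claude-3-7-sonnet', api_key: 'my_api_key')
# => { model: 'claude-3-7-sonnet', api_key: 'my_api_key' }
```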
@pricetodd - thanks for bringing that up -- just included it.
Thanks for taking this on! While I appreciate the implementation, I think we can make this API sing with a bit more Ruby magic. Instead of adding one-off parameters, let's embrace the beauty of configuration blocks and our existing chainable interface. Here's what the API should look like:

```ruby
# Beautiful per-instance configuration that matches our global style
chat = RubyLLM.chat.with_config do |config|
  config.anthropic_api_key = "different_key"
  config.request_timeout = 60
end

# Chainable like everything else
RubyLLM.embed("text").with_config do |config|
  config.openai_api_key = "different_key"
end
```

This feels much more Ruby-like and matches our existing chainable methods. Would you like to revise the PR to implement this pattern? The key pieces would be:
Let me know if you'd like guidance on any of those pieces. Looking forward to seeing this land with the new pattern! /cc #55
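The chainable `with_config` idea above could be sketched as follows (a toy `WithConfigSketch` module standing in for RubyLLM; all names and internals are assumptions, not the library's real code). The essential moves: the instance copies the global config, the block mutates only that copy, and the method returns `self` so calls keep chaining.

```ruby
# Hypothetical sketch of chainable per-instance configuration.
# WithConfigSketch is illustrative, not RubyLLM's actual implementation.
module WithConfigSketch
  Config = Struct.new(:anthropic_api_key, :request_timeout)

  # Global configuration with defaults, mirroring a configure-style setup.
  def self.config
    @config ||= Config.new(nil, 30)
  end

  class Chat
    attr_reader :config

    def initialize
      @config = WithConfigSketch.config.dup # isolated per-instance copy
    end

    # Yields this instance's own config copy; returns self for chaining.
    def with_config
      yield @config
      self
    end
  end

  def self.chat
    Chat.new
  end
end

chat = WithConfigSketch.chat.with_config do |config|
  config.anthropic_api_key = 'different_key'
  config.request_timeout   = 60
end

chat.config.request_timeout             # => 60
WithConfigSketch.config.request_timeout # => 30 (global default untouched)
```

Because `Struct#dup` copies the member values, mutating the instance copy never leaks back into the global defaults, which is the property that makes the pattern safe for per-instance overrides.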
Great, love it. This approach makes sense. It's easy for me to tunnel vision and do the quick/easy thing when I see an issue so I appreciate the nudge towards the more elegant solution.
Yes. I'll update the PR -- thanks for the guidance. I'll re-request a review once I have a draft of the revised implementation.
Force-pushed from c547dee to 623098d
@crmne – I think we may want to consider changing the internal implementation and API of the one-off methods. I'd love to hear your thoughts on which direction you'd prefer before I implement one or the other for the one-off methods.
Hey @BrianBorge! I love how this conversation is getting at the heart of configuration patterns in Ruby. Let me add my 2¢ on this. Rather than getting into competing styles of one-off instance configs, I think we should elevate the abstraction. What we really want here is a first-class Context object that can carry its own complete configuration state. Here's what I'm thinking:

```ruby
# Global defaults work as before
RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
end

# But now we can create isolated contexts with their own config
context = RubyLLM.context do |config|
  config.openai_api_key = "team_a_key"
  config.request_timeout = 60
end

# Each context works independently
context.chat(...)
context.embed("Hello")

# Different contexts don't interfere
context_a = RubyLLM.context { |c| c.openai_api_key = "team_a_key" }
context_b = RubyLLM.context { |c| c.anthropic_api_key = "team_b_key" }

# Perfect for concurrent usage
Fiber.new { context_a.chat(...) }.resume
Fiber.new { context_b.chat(...) }.resume
```

This gives us, among other things, thread safety and future-proofing.
The core idea here is that instead of trying to bolt configuration onto individual operations, we make the configuration context a first-class citizen that knows how to perform operations. What do you think? This feels like a more "whole solution" that solves both the immediate need and gives us a better foundation for the future.
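The context idea above could be sketched like this (a toy `ContextSketch` module; names, config fields, and the `chat` body are all assumptions, not RubyLLM internals). Each context snapshots the global config at creation time, and every operation on the context reads that snapshot, so two contexts can never interfere:

```ruby
# Hypothetical sketch of a first-class configuration context.
# ContextSketch is illustrative, not RubyLLM's actual implementation.
module ContextSketch
  Config = Struct.new(:openai_api_key, :anthropic_api_key, :request_timeout)

  # Global defaults, as set by a configure-style block.
  def self.config
    @config ||= Config.new('global_key', nil, 30)
  end

  class Context
    attr_reader :config

    def initialize
      @config = ContextSketch.config.dup # isolated snapshot of the globals
      yield @config if block_given?
    end

    # Stand-in operation: a real one would call the provider using @config.
    def chat(prompt)
      "#{prompt} (key=#{@config.openai_api_key})"
    end
  end

  def self.context(&block)
    Context.new(&block)
  end
end

context_a = ContextSketch.context { |c| c.openai_api_key = 'team_a_key' }
context_b = ContextSketch.context { |c| c.openai_api_key = 'team_b_key' }

# Each context uses its own key; the global default stays unchanged.
context_a.chat('hi')  # => "hi (key=team_a_key)"
context_b.chat('hi')  # => "hi (key=team_b_key)"
```

Since a context never mutates shared state after construction, this shape is also what makes the fiber/thread concurrency in the comment above safe: concurrent operations only ever read their own context's config.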
Thread safety and future-proofing? Love it. I'll re-request a review once I have a revised implementation. Again, thanks for your guidance.
Made my own implementation in 5e73fe3.
This PR addresses #55 by adding support for passing an API Key per request.
Proposed API (and implemented in PR)
Embedding Example
Paint Example
Questions/Notes