Feat: Perplexity support #42
Conversation
Please let me know if anything is missing from this first implementation. Question: how can I run tests locally?
@joaoGabriel55 I started a pull request here. Do you want to either tweak what I have, or incorporate my changes into your branch?
@adenta let me know if there is anything to improve.
@gquaresma-godaddy we need tests! Want to take a crack at it?
Sure! I will work on it. |
Looks very good. It looks like sonar has support for vision, so it would need to be added to the chat_content_spec vision test and the chat_streaming_spec.
Two new features to consider in your provider implementation:
Added configuration requirements handling in 75f99a1. Each provider now specifies what configuration is required via a simple
Example of the new error messages:
RubyLLM::ConfigurationError: anthropic provider is not configured. Add this to your initialization:
RubyLLM.configure do |config|
config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
end
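The configuration check described above could be sketched as follows. This is an illustrative guess at the mechanism, not the actual code from 75f99a1: the `REQUIRED_KEYS` map, the `Config` struct, and the `ensure_configured!` helper are hypothetical names, but the error message mirrors the example shown.

```ruby
# Hedged sketch: each provider declares which config keys it requires,
# and a shared check raises ConfigurationError with a copy-pastable
# initializer snippet listing the missing keys.
class ConfigurationError < StandardError; end

# Hypothetical stand-in for RubyLLM's configuration object.
Config = Struct.new(:anthropic_api_key, :perplexity_api_key, keyword_init: true)

REQUIRED_KEYS = {
  anthropic: [:anthropic_api_key],
  perplexity: [:perplexity_api_key]
}.freeze

def ensure_configured!(provider, config)
  missing = REQUIRED_KEYS.fetch(provider).reject { |key| config[key] }
  return if missing.empty?

  lines = missing.map { |key| "  config.#{key} = ENV['#{key.to_s.upcase}']" }
  raise ConfigurationError,
        "#{provider} provider is not configured. Add this to your initialization:\n" \
        "RubyLLM.configure do |config|\n#{lines.join("\n")}\nend"
end

# Usage: an unconfigured provider produces the error shown above.
begin
  ensure_configured!(:anthropic, Config.new)
rescue ConfigurationError => e
  puts e.message
end
```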
@joaoGabriel55 is this still on your radar? I'd love to merge Perplexity support soon. Whenever you're ready, could you resolve the conflicts and request a review?
Thank you for your work!
It looks generally good, I just need some changes and tests. Please pick the cheapest model to test with.
Hi @joaoGabriel55, are you able to provide some VCRs for this?
…aoGabriel55/ruby_llm into feat/add-perplexity-provider
5) RubyLLM::Chat function calling perplexity/sonar can use tools with multi-turn streaming conversations
Failure/Error: raise UnsupportedFunctionsError, "Model #{@model.id} doesn't support function calling"
RubyLLM::UnsupportedFunctionsError:
Model sonar doesn't support function calling
# ./lib/ruby_llm/chat.rb:50:in 'RubyLLM::Chat#with_tool'
# ./spec/ruby_llm/chat_tools_spec.rb:110:in 'block (4 levels) in <top (required)>'
# ./spec/spec_helper.rb:86:in 'block (3 levels) in <top (required)>'
# /Users/quaresma/.rvm/gems/ruby-3.4.1/gems/vcr-6.3.1/lib/vcr/util/variable_args_block_caller.rb:9:in 'VCR::VariableArgsBlockCaller#call_block'
# /Users/quaresma/.rvm/gems/ruby-3.4.1/gems/vcr-6.3.1/lib/vcr.rb:194:in 'VCR#use_cassette'
# ./spec/spec_helper.rb:85:in 'block (2 levels) in <top (required)>'
# /Users/quaresma/.rvm/gems/ruby-3.4.1/gems/webmock-3.25.1/lib/webmock/rspec.rb:39:in 'block (2 levels) in <top (required)>'
Any ideas about that 😅? This is the sonar model config from the
{
"id": "sonar",
"created_at": null,
"display_name": "Sonar",
"provider": "perplexity",
"context_window": 128000,
"max_tokens": 4096,
"type": "chat",
"family": "sonar",
"supports_vision": true,
"supports_functions": false,
"supports_json_mode": true,
"input_price_per_million": 1.0,
"output_price_per_million": 1.0,
"metadata": {
"description": "Lightweight offering with search grounding, quicker and cheaper than Sonar Pro."
}
}
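The failure above follows directly from this config: `"supports_functions": false` means `Chat#with_tool` raises before any request is made, so the tool-calling specs cannot pass for sonar. A minimal sketch of that guard, with illustrative class names rather than the actual ruby_llm internals:

```ruby
# Hedged sketch of the capability guard behind UnsupportedFunctionsError:
# registering a tool on a model whose metadata says it cannot call
# functions fails fast instead of sending a doomed API request.
class UnsupportedFunctionsError < StandardError; end

# Hypothetical stand-in for a model record loaded from the models registry.
Model = Struct.new(:id, :supports_functions, keyword_init: true)

class Chat
  def initialize(model)
    @model = model
    @tools = []
  end

  def with_tool(tool)
    unless @model.supports_functions
      raise UnsupportedFunctionsError,
            "Model #{@model.id} doesn't support function calling"
    end
    @tools << tool
    self # allow chaining, e.g. chat.with_tool(a).with_tool(b)
  end
end

sonar = Model.new(id: "sonar", supports_functions: false)
begin
  Chat.new(sonar).with_tool(:weather)
rescue UnsupportedFunctionsError => e
  puts e.message # => Model sonar doesn't support function calling
end
```

Given this, a reasonable fix for the spec failure is to skip (or gate) the tool-calling examples for providers or models whose metadata reports `supports_functions: false`, rather than recording VCR cassettes for them.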
Issue
#20
Description
This PR adds Perplexity API support.