Allow using new capabilities when using assume_exists #350


Open

victorlcampos wants to merge 1 commit into main

Conversation


@victorlcampos victorlcampos commented Aug 15, 2025

What this does

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

Related issues

@@ -66,7 +66,7 @@ def resolve(model_id, provider: nil, assume_exists: false, config: nil) # ruboco
         id: model_id,
         name: model_id.tr('-', ' ').capitalize,
         provider: provider_instance.slug,
-        capabilities: %w[function_calling streaming],
+        capabilities: provider_class.capabilities.capabilities_for(model_id),
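In effect, the hunk swaps a hardcoded capability list for whatever the provider's capabilities module reports for the given model id. A simplified before/after sketch, using only the names visible in the diff (provider_class and its capabilities module are whatever the surrounding resolve code already defines):

    # Before: every assumed model is reported with the same two capabilities.
    capabilities = %w[function_calling streaming]

    # After (this PR): ask the provider's capabilities module, which presumably
    # infers capabilities from the model id.
    capabilities = provider_class.capabilities.capabilities_for(model_id)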
Owner
This will not work. The entire point of assume_model_exists is that the model doesn't exist. Take a look at the array of possible capabilities in models_schema.json instead.
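A hedged sketch of what that suggestion could look like: read the set of allowed capability names from models_schema.json instead of asking the provider about a model that does not exist. The file path and the dig path into the schema are assumptions here; adjust them to wherever the schema actually declares its capability enum.

    require 'json'

    # Assumed location of the schema and of the capability enum inside it.
    schema = JSON.parse(File.read('lib/ruby_llm/models_schema.json'))
    all_capabilities = schema.dig('properties', 'capabilities', 'items', 'enum') || []

    # If the intent is that an assumed model should advertise every possible
    # capability (since we cannot know what it really supports):
    capabilities = all_capabilities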

Comment on lines +84 to +86
model: 'gemini-pro',
provider: 'gemini',
assume_model_exists: true
Owner

I'd rather use the Ollama provider for this one and a randomized model name.
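A hedged sketch of that suggestion, assuming the keyword arguments shown above are passed to RubyLLM.chat; the random-name construction is illustrative only:

    require 'securerandom'

    # Point the example at Ollama with a model name that cannot exist in the
    # registry, so assume_model_exists is genuinely exercised.
    chat = RubyLLM.chat(
      model: "made-up-model-#{SecureRandom.hex(4)}",
      provider: 'ollama',
      assume_model_exists: true
    )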
