
[fix] use openai model provider as default #644


Open · wants to merge 1 commit into main

Conversation

@jneeee commented May 3, 2025

This commit fixes the following error:

When a default OpenAI client is set but the litellm provider is not installed, model names with an unrecognized prefix raise an "unknown prefix" error.

Another motivation: many endpoints are already compatible with the OpenAI SDK/API, and I want to use them without installing litellm.

import os

from openai import AsyncOpenAI
from agents import Agent, set_default_openai_client

client = AsyncOpenAI(
    base_url="https://integrate.api.nvidia.com/v1",
    api_key=os.getenv("NV_API_KEY"),
)
set_default_openai_client(client=client, use_for_tracing=False)
...
agent = Agent(
    name="Assistant",
    instructions="You only respond in haikus.",
    model="meta/llama-4-scout-17b-16e-instruct",
    tools=[get_weather],
)

Resulting error:

openai-agents-python/src/agents/models/multi_provider.py", line 117, in _create_fallback_provider
    raise UserError(f"Unknown prefix: {prefix}")
agents.exceptions.UserError: Unknown prefix: meta
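For context, here is a minimal sketch (my own illustration, not the SDK's actual code in multi_provider.py) of how a MultiProvider-style resolver might split a model name on the first `/` and dispatch on the prefix. The function name `resolve_provider` and the known-prefix set are hypothetical:

```python
# Hypothetical sketch of prefix dispatch; the real logic lives in
# src/agents/models/multi_provider.py of openai-agents-python.
KNOWN_PREFIXES = {"openai", "litellm"}  # illustrative set, not the SDK's


class UserError(Exception):
    pass


def resolve_provider(model_name: str) -> str:
    """Split 'prefix/model' and pick a provider by prefix (sketch)."""
    if "/" not in model_name:
        # No prefix: fall through to the default OpenAI provider.
        return "openai"
    prefix, _, _rest = model_name.partition("/")
    if prefix not in KNOWN_PREFIXES:
        # This is the branch this PR hits: "meta/..." has no
        # registered provider, so it raises instead of falling back.
        raise UserError(f"Unknown prefix: {prefix}")
    return prefix
```

Under this sketch, `meta/llama-4-scout-17b-16e-instruct` raises `UserError: Unknown prefix: meta`, matching the traceback above; the PR title suggests falling back to the default OpenAI provider in that case instead of raising.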
