v0.28.0
What's Changed
🪵 🔥 Logfire / OpenTelemetry now supported!
This makes it much easier to follow which tool calls are being made by the LLM, both as printed output locally and in Logfire or another monitoring service. It also lets you see the raw requests being sent to OpenAI/Anthropic so you can more easily debug issues.
All it takes to get set up is:

```shell
pip install logfire
```

```python
import logfire

logfire.configure(send_to_logfire=False)  # Or True to use the Logfire service
logfire.instrument_openai()  # optional, to trace OpenAI API calls
# logfire.instrument_anthropic()  # optional, to trace Anthropic API calls
```
Check out the new docs page: https://magentic.dev/logging-and-tracing/
PRs
- Add basic logging and MAGENTIC_VERBOSE env var by @jackmpcollins in #263
- Update dependencies by @jackmpcollins in #264
- Instrument for Logfire / OpenTelemetry by @jackmpcollins in #265
- Do not set stream_options when using AzureOpenAI by @jackmpcollins in #262
- Use new `parallel_tool_calls` arg with OpenAI API by @jackmpcollins in #267
- Fix LitellmChatModel tool_choice parameter to force Anthropic tool use by @jackmpcollins in #268
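The MAGENTIC_VERBOSE env var from #263 can be set from the shell; a sketch, assuming a truthy value enables it (`my_script.py` is a placeholder for your own script — check #263 for the exact accepted values):

```shell
# Enable magentic's verbose logging for a single run
MAGENTIC_VERBOSE=1 python my_script.py
```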
Full Changelog: v0.27.0...v0.28.0