
Add LLM-assisted retries #288

Merged
merged 48 commits into main from add-retries-against-llm
Aug 12, 2024
Conversation

jackmpcollins
Owner

@jackmpcollins jackmpcollins commented Aug 1, 2024

Add a max_retries param to the prompt and chatprompt decorators, enabling retries in which the LLM is shown its own errors and asked to fix them.

Example of using a Pydantic validator to reject an LLM output and have it retried:

from typing import Annotated

from magentic import prompt
from pydantic import AfterValidator, BaseModel


def assert_is_ireland(v: str) -> str:
    assert v == "Ireland", "Country must be Ireland"
    return v


class Country(BaseModel):
    name: Annotated[str, AfterValidator(assert_is_ireland)]
    capital: str


@prompt("Return a country", max_retries=3)
def get_country() -> Country: ...


get_country()
07:13:45.872 Calling prompt-function get_country
07:13:45.888   LLM-assisted retries enabled. Max 3
07:13:45.935     Chat Completion with 'gpt-4o' [LLM]
07:13:46.397     streaming response from 'gpt-4o' took 0.14s [LLM]
07:13:46.399     Retrying Chat Completion. Attempt 1
07:13:46.411     Chat Completion with 'gpt-4o' [LLM]
07:13:46.938     streaming response from 'gpt-4o' took 0.16s [LLM]

Country(name='Ireland', capital='Dublin')
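To see why the first completion above is rejected, here is a minimal sketch of the validation step on its own, using plain Pydantic with no LLM involved (TypeAdapter is used here only for illustration; the PR example attaches the same validator to a model field). An AssertionError raised inside an AfterValidator surfaces as a ValidationError, which is the signal that triggers a retry.

```python
# Minimal sketch of the validation that gates each LLM output:
# Pydantic runs AfterValidator functions after the basic type check,
# and an AssertionError inside one becomes a ValidationError.
from typing import Annotated

from pydantic import AfterValidator, TypeAdapter, ValidationError


def assert_is_ireland(v: str) -> str:
    assert v == "Ireland", "Country must be Ireland"
    return v


IrelandName = TypeAdapter(Annotated[str, AfterValidator(assert_is_ireland)])

IrelandName.validate_python("Ireland")  # accepted unchanged

try:
    IrelandName.validate_python("France")
except ValidationError as exc:
    # The assertion message is preserved in the error details,
    # so it can be shown back to the model on retry.
    print("rejected:", exc.errors()[0]["msg"])
```

Any value other than "Ireland" raises, which in the decorated function above would be resubmitted to the model along with the error.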

Closes #166
Related to #210
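For intuition, the retry behavior can be sketched as a plain loop: validate the model's output, and on a ValidationError append the error text to the conversation and try again, up to max_retries. This is a hypothetical illustration with a stubbed model function (fake_llm), not magentic's actual implementation.

```python
# Hypothetical sketch of LLM-assisted retries (not magentic's internals):
# each ValidationError is fed back so the next attempt can correct it.
from typing import Annotated

from pydantic import AfterValidator, BaseModel, ValidationError


def assert_is_ireland(v: str) -> str:
    assert v == "Ireland", "Country must be Ireland"
    return v


class Country(BaseModel):
    name: Annotated[str, AfterValidator(assert_is_ireland)]
    capital: str


def fake_llm(messages: list[str]) -> dict:
    # Stub model: answers wrongly first, then "corrects itself" once a
    # validation error appears in the message history.
    if any("Country must be Ireland" in m for m in messages):
        return {"name": "Ireland", "capital": "Dublin"}
    return {"name": "France", "capital": "Paris"}


def get_country(max_retries: int = 3) -> Country:
    messages = ["Return a country"]
    for attempt in range(max_retries + 1):
        raw = fake_llm(messages)
        try:
            return Country.model_validate(raw)
        except ValidationError as exc:
            if attempt == max_retries:
                raise  # retries exhausted; propagate the last error
            messages.append(str(exc))  # feed the error back to the model
    raise AssertionError("unreachable")


print(get_country())
```

Here the first attempt fails validation, the error is appended to the messages, and the second attempt succeeds, mirroring the "Retrying Chat Completion. Attempt 1" line in the logs above.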

@jackmpcollins jackmpcollins self-assigned this Aug 1, 2024
@jackmpcollins jackmpcollins marked this pull request as ready for review August 12, 2024 05:58
@jackmpcollins jackmpcollins merged commit c7cb858 into main Aug 12, 2024
2 checks passed
@jackmpcollins jackmpcollins deleted the add-retries-against-llm branch August 12, 2024 06:06