fix: adding trace i/o in langfuse openai integration #532
Conversation
langfuse/openai.py (Outdated)
@@ -325,6 +330,8 @@ async def _get_langfuse_data_from_async_streaming_response(
    resource, responses
)

+ langfuse.trace(id=generation.trace_id, output=completion)
In these cases, we no longer know whether the trace was generated by the user beforehand and whether they provided the traceId themselves. Can we somehow get that information here and only update in that case, similar to how we do it with the input?
I have modified the approach to return an is_nested_trace bool value that identifies the state of the trace. If the trace is freshly created (i.e. it is not nested), then we modify the input and the output.
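Roughly, the output side of that gating could look like the sketch below; the helper and parameter names are illustrative, not the exact internals of langfuse/openai.py.

def _finalize_trace_output(langfuse, generation, completion, is_nested_trace):
    # Hypothetical helper: mirror the completion onto the trace body only
    # when the integration created the trace itself.
    if is_nested_trace:
        # The user passed their own trace_id, so the trace belongs to them;
        # leave its input/output untouched and keep only the attached generation.
        return
    # Freshly created (non-nested) trace: safe to set its output.
    langfuse.trace(id=generation.trace_id, output=completion)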
@@ -505,7 +530,7 @@ async def _wrap_async(
    start_time = _get_timestamp()
    arg_extractor = OpenAiArgsExtractor(*args, **kwargs)

-   generation = _get_langfuse_data_from_kwargs(
+   generation, is_nested_trace = _get_langfuse_data_from_kwargs(
I was wondering if we could return a nested-trace bool from the existing function, which would make the implementation easier to review and possibly reduce the touch points in the code. If we change anything, we change it only inside the function and the rest should fall into place.
Implementation looks good to me. Could you add a test to the openai tests ensuring we only add i/o in the case discussed?
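A test for that case might look roughly like the sketch below. This is not the repository's actual test; the fetch accessor and assertions are assumptions, and the real suite may use its own API fixture.

from langfuse import Langfuse
from langfuse.openai import openai

def test_user_provided_trace_id_does_not_overwrite_trace_io():
    langfuse = Langfuse()
    # The user creates and owns the trace up front.
    trace = langfuse.trace(name="user-owned-trace", input="original input")

    openai.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Say hello"}],
        trace_id=trace.id,  # nested case: the integration must not touch trace i/o
    )
    langfuse.flush()

    fetched = langfuse.client.trace.get(trace.id)  # assumed API accessor
    # The generation is attached, but the trace body keeps its original i/o.
    assert fetched.input == "original input"
    assert fetched.output is None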
…ngfuse/langfuse-python into noble-varghese/fix-trace-input-output
…r in the input to openai. In that case the integration is expected to add the generation in the trace and not update the i/o
@maxdeichmann I have added a test covering the scenario we discussed, where the user provides a trace_id to the openai integration, and updated the other tests to verify that the trace i/o matches the input to and output from the model.
Description
The Langfuse OpenAI integration previously failed to include input and output data in the trace body, recording them only on the generation.
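For illustration, in the default (non-nested) path the trace created by the drop-in wrapper now carries the prompt and completion itself; the snippet below assumes standard OpenAI chat parameters plus the integration's name pass-through argument.

from langfuse.openai import openai  # drop-in replacement for the OpenAI client

completion = openai.chat.completions.create(
    name="trace-io-demo",
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What is Langfuse?"}],
)
# Previously only the nested generation recorded the messages and completion;
# with this change, a trace created by the integration itself also gets its
# input/output set, while user-provided traces are left untouched.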
Key Changes: