Doing further work on #1610, I've noticed completely different behavior between v1.0.0 and v1.1.0 when working with LangGraph streams. The stream consists of `dict` chunks produced by a LangGraph agent. In v1.0.0, the stream appeared in the chat GUI as expected, message by message. In v1.1.0, the stream only appeared after full completion, with all messages rendered in the GUI simultaneously. Here is a (hopefully) reproducible example:
```python
# my_agent.py

### DEFINE THE GRAPH ###
import operator
import time
from typing import Annotated, Any

from typing_extensions import TypedDict

from langgraph.graph import StateGraph, START, END


class State(TypedDict):
    # The operator.add reducer fn makes this append-only
    aggregate: Annotated[list, operator.add]


def node_a(state):
    time.sleep(3)
    return {"aggregate": ["I'm A"]}


def node_b(state):
    time.sleep(5)
    return {"aggregate": ["I'm B"]}


def node_c(state):
    time.sleep(5)
    return {"aggregate": ["I'm C"]}


def node_d(state):
    time.sleep(5)
    return {"aggregate": ["I'm D"]}


builder = StateGraph(State)
builder.add_node("a", node_a)
builder.add_edge(START, "a")
builder.add_node("b", node_b)
builder.add_node("c", node_c)
builder.add_node("d", node_d)
builder.add_edge("a", "b")
builder.add_edge("a", "c")  # Nodes B and C should appear together by design
builder.add_edge("b", "d")
builder.add_edge("c", "d")
builder.add_edge("d", END)

graph = builder.compile()
```
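For context on what the callback below consumes: with `stream_mode="updates"`, each chunk yielded by `astream` is assumed to be a dict mapping the node name that just finished to its state update. A minimal, self-contained sketch of that extraction logic, using hand-written chunks in the assumed shape rather than a real graph run:

```python
# Chunks in the shape stream_mode="updates" is assumed to produce:
# one {node_name: state_update} dict per completed node.
chunks = [
    {"a": {"aggregate": ["I'm A"]}},
    {"b": {"aggregate": ["I'm B"]}},
    {"c": {"aggregate": ["I'm C"]}},
    {"d": {"aggregate": ["I'm D"]}},
]


def extract_message(chunk: dict) -> str:
    """Pull the single aggregate entry out of a one-key update chunk."""
    assert len(chunk) == 1, "Got multiple keys from the stream :/"
    (update,) = chunk.values()
    return update["aggregate"][0]


messages = [extract_message(c) for c in chunks]
print(messages)  # -> ["I'm A", "I'm B", "I'm C", "I'm D"]
```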
```python
# app.py

from shiny.express import ui

import my_agent

# Set some Shiny page options
ui.page_opts(
    title="Hello LangChain Chat Models",
    fillable=True,
    fillable_mobile=True,
)

# Create and display an empty chat UI
chat = ui.Chat(id="chat")
chat.ui()


# Define a callback to run when the user submits a message
@chat.on_user_submit
async def _():
    # Get messages currently in the chat
    messages = chat.messages(format="langchain")

    # Create a response message stream
    async for chunk in my_agent.graph.astream({"aggregate": []}, stream_mode="updates"):
        assert len(chunk.keys()) == 1, 'Got multiple keys from the stream :/'
        for key, value in chunk.items():
            print(chunk)
            output = chunk.get(key).get('aggregate')[0]
            await chat.append_message_stream(output)
```
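The expected v1.0.0 behavior is that each message is rendered as soon as its chunk arrives, rather than after the whole stream finishes. A minimal sketch of that incremental-consumption pattern, with `fake_updates` (and its short delays) standing in for the slow graph run; nothing here calls real LangGraph or Shiny APIs:

```python
import asyncio


async def fake_updates():
    # Stand-in for graph.astream(..., stream_mode="updates"): one
    # {node_name: state_update} chunk per node, with a delay mimicking
    # the time.sleep calls in my_agent.py.
    for name in ["I'm A", "I'm B", "I'm C", "I'm D"]:
        await asyncio.sleep(0.01)
        yield {"node": {"aggregate": [name]}}


async def consume_incrementally():
    # The behavior expected from v1.0.0: handle each message the moment
    # its chunk arrives, instead of collecting the full stream first.
    seen = []
    async for chunk in fake_updates():
        (update,) = chunk.values()
        seen.append(update["aggregate"][0])  # would be shown in the chat here
    return seen


print(asyncio.run(consume_incrementally()))
```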
In practice, this makes working with LangGraph agents infeasible and forces a downgrade to v1.0.0.