AI Integration

Streaming Agent State from Python to React in Real Time with AG-UI

Apr 14, 2026 7 min read

ShopAgent’s UI feels live—product cards stream in, carts update instantly, and checkout confirmations appear without a single poll or WebSocket. This article shows how AG-UI streams LangGraph state from Python to a Next.js frontend over Server-Sent Events, and how a small React hook keeps everything in sync.

How LangGraph state — cart items, product cards, checkout session, errors — reaches a Next.js frontend in real time via Server-Sent Events, without polling or page reloads.

The shopping experience in ShopAgent is live. When the agent searches for products, cards appear as the node completes. When a user adds an item to the cart, the cart panel updates immediately. When checkout completes, the confirmation appears without a page reload.

None of this uses polling. There is no WebSocket. The entire real-time state sync between a Python LangGraph orchestrator and a Next.js React frontend is handled by AG-UI (the Agent-User Interaction protocol) over Server-Sent Events.

This post explains how it works end to end — from the LangGraph state update to the React component re-render.

WHAT AG-UI IS

 

AG-UI is a protocol that defines how an AI agent streams its state and messages to a frontend in real time. It sits on top of Server-Sent Events (SSE) — the browser's native mechanism for receiving a stream of events from a server without keeping a WebSocket open.

The protocol defines event types:

  • text_message_start / text_message_content / text_message_end — for streaming the LLM's text response word by word
  • state_snapshot — for broadcasting the full agent state after each node update
  • run_started / run_finished — lifecycle events

ShopAgent uses CopilotKit as the AG-UI runtime. CopilotKit wraps LangGraph's astream_events and maps LangChain events to the AG-UI protocol, then streams them to the frontend.
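Conceptually, a consumer of this stream just splits the HTTP response into `data:` frames and JSON-parses each one. A minimal sketch in TypeScript — the event shape here is an illustration based on the event types listed above, not the exact AG-UI schema:

```typescript
// Illustrative shapes for the AG-UI event types listed above.
type AgUiEvent =
  | { type: "run_started" | "run_finished" }
  | { type: "text_message_start" | "text_message_end"; messageId: string }
  | { type: "text_message_content"; messageId: string; delta: string }
  | { type: "state_snapshot"; snapshot: Record<string, unknown> };

// SSE frames look like `data: <json>`; anything else (comments,
// keep-alive lines) is ignored.
function parseSseLine(line: string): AgUiEvent | null {
  if (!line.startsWith("data: ")) return null;
  try {
    return JSON.parse(line.slice("data: ".length)) as AgUiEvent;
  } catch {
    return null; // not a JSON payload
  }
}
```

In practice CopilotKit's client does this parsing for you; the sketch is only meant to show that there is no magic below the hook.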

THE SSE STREAM: FROM LANGGRAPH TO THE BROWSER

 

When a user sends a message, the browser makes a POST request to /api/copilotkit/agent/shopAgent/run. The response is a text/event-stream — a long-lived HTTP response that pushes events as they happen.

The Python orchestrator processes the request:

# main.py — simplified
async for event in compiled_graph.astream_events(
    {"messages": [human_message]},
    config={"configurable": {"thread_id": thread_id}},
    version="v2",
):
    # CopilotKit maps LangChain events to AG-UI protocol events
    # and writes them to the SSE stream; copilotkit_emit_state
    # additionally pushes the current graph state as a snapshot
    await copilotkit_emit_state(config, state)

Each time a LangGraph node completes and updates state, CopilotKit emits a state_snapshot event containing the full current state. The browser receives this and the frontend hook processes it.

STATE SNAPSHOTS: THE MECHANISM

 

On the wire, a state update event looks like this:

data: {"type":"StateDeltaEvent","threadId":"abc123","delta":{"cart":[...],"product_candidates":[...],"a2ui_payload":{...}}}

CopilotKit's wire-level name for the event is StateDeltaEvent; its delta carries the fields that changed in the last node run. The frontend receives this and merges it into the local state.

The full agent state includes everything the frontend needs:

  • cart — the current cart items with prices and quantities
  • product_candidates — the enriched product list from the last search
  • checkout_session — the active UCP session, or null
  • a2ui_payload — the latest A2UI structured UI description
  • errors — any ProtocolError objects from the current run
  • next_action — which node is currently executing (shown in the loading indicator)
  • recommender_available / inventory_available — service health flags

All of this comes from the Python state. The frontend does not compute any of it — it receives, renders, and updates.
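On the TypeScript side, that state can be modeled as an interface mirroring the fields above. The inner types and the initial defaults below are illustrative assumptions, not ShopAgent's exact definitions:

```typescript
// Shape of the streamed agent state, following the field list above.
// Inner types (CartItem, error shape, ...) are illustrative assumptions.
interface CartItem { productId: string; name: string; price: number; quantity: number; }

interface AgentState {
  cart: CartItem[];
  product_candidates: Record<string, unknown>[]; // enriched products from last search
  checkout_session: { id: string } | null;       // active UCP session, or null
  a2ui_payload: Record<string, unknown> | null;  // latest A2UI UI description
  errors: { code: string; message: string }[];   // ProtocolError objects
  next_action: string | null;                    // node currently executing
  recommender_available: boolean;                // service health flags
  inventory_available: boolean;
}

// Snapshots can arrive progressively, so the UI starts from an
// explicit empty state (health flags assumed healthy until told otherwise).
const initialState: AgentState = {
  cart: [],
  product_candidates: [],
  checkout_session: null,
  a2ui_payload: null,
  errors: [],
  next_action: null,
  recommender_available: true,
  inventory_available: true,
};
```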

THE FRONTEND HOOK

 

On the React side, a single hook gives access to the streamed agent state:

const { agent } = useAgent({ agentId: "shopAgent" });
const isLoading = agent.isRunning ?? false;

agent.state is populated reactively as state_snapshot events arrive. It starts as {} and is progressively updated as nodes complete.

agent.isRunning is true while the SSE stream is open. The loading indicator and disabled input are driven by this flag.

ShopAgent does not render the raw agent.state directly. A useEffect merges each incoming update into local display state rather than replacing it wholesale:

useEffect(() => {
    const s = agent.state as Partial<AgentState> | null | undefined;
    if (!s || Object.keys(s).length === 0) return;

    setDisplayState(prev => ({
        ...prev,
        product_candidates:
            (s.product_candidates?.length ?? 0) > 0
                ? s.product_candidates!
                : prev.product_candidates,
        cart: s.cart !== undefined ? s.cart : prev.cart,
        checkout_session: s.checkout_session !== undefined
            ? s.checkout_session : prev.checkout_session,
        // ... other fields
    }));
}, [agent.state]);

MERGING STATE UPDATES — AVOIDING RACE CONDITIONS

 

There is a known issue in CopilotKit's AG-UI implementation: when a new run starts, agent.state is temporarily reset to {} before the first state_snapshot arrives. If the frontend replaces its state wholesale on each update, the product cards disappear for a fraction of a second every time any action runs — including adding to cart.

The merge approach above avoids this. The rule for product_candidates is: only overwrite if the new value is non-empty. An empty array from the reset does not wipe out a good search result. The rule for cart is: always update when defined, because an empty cart after order completion is a meaningful state, not a reset artifact.

Similarly, the checkout_session is only overwritten if the incoming value is explicitly defined — undefined means the snapshot does not include session data (not that the session was cleared).

This pattern — merge rather than replace, with field-specific rules — is essential for a smooth user experience when state updates are frequent and partial.
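Pulled out of the useEffect into a pure function, the field-specific rules become easy to unit-test in isolation. A sketch, with the field types simplified:

```typescript
// Simplified state shape for the three fields discussed above.
interface State {
  product_candidates: unknown[];
  cart: unknown[];
  checkout_session: { id: string } | null;
}

// Merge a partial snapshot into the previous state, field by field:
// - product_candidates: only overwrite with a non-empty list, so the
//   empty-object reset at run start cannot wipe visible cards
// - cart: overwrite whenever defined (an empty cart is meaningful)
// - checkout_session: overwrite only when explicitly present
function mergeSnapshot(prev: State, s: Partial<State>): State {
  return {
    product_candidates:
      (s.product_candidates?.length ?? 0) > 0
        ? s.product_candidates!
        : prev.product_candidates,
    cart: s.cart !== undefined ? s.cart : prev.cart,
    checkout_session:
      s.checkout_session !== undefined ? s.checkout_session : prev.checkout_session,
  };
}
```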

WHY THE CLASSIFICATION NODE USES THE OPENAI SDK DIRECTLY

 

The classify_intent node uses AsyncOpenAI directly instead of LangChain:

from openai import AsyncOpenAI

_classify_client = AsyncOpenAI()

completion = await _classify_client.beta.chat.completions.parse(
    model="gpt-4o",
    messages=oai_messages,
    response_format=ClassifyIntentResult,
    temperature=0,
)

The reason is AG-UI. CopilotKit's runtime intercepts all LangChain events and streams them as AG-UI events. If the classification call used LangChain's with_structured_output(), the raw JSON response — {"actions": [{"intent": "search_products", "query": "running shoes"}]} — would appear in the chat panel as an assistant message before the actual product list response.

The OpenAI SDK call bypasses LangChain's event pipeline entirely. The classification is invisible to the user. Only the respond node's LangChain call is captured and streamed.

THE COPILOT RUNTIME PROXY

 

The Next.js frontend does not call the Python orchestrator directly. All /api/copilotkit/* requests go to a Next.js route handler that proxies to the orchestrator:

// frontend/src/app/api/copilotkit/[[...path]]/route.ts
const runtime = new CopilotRuntime({
    agents: {
        shopAgent: new LangGraphHttpAgent({
            url: `${orchestratorUrl}/copilotkit`,
        }),
    },
});

The LangGraphHttpAgent connects to the Python orchestrator's /copilotkit endpoint. The CopilotKit runtime handles the AG-UI protocol translation — it takes the orchestrator's LangGraph events and produces the SSE stream the browser consumes.

This proxy architecture means the orchestrator does not need to implement the full AG-UI protocol directly. It runs LangGraph. CopilotKit translates.

WHAT WE'D DO DIFFERENTLY

 

Selective state streaming — Currently, the full agent state is included in every state_snapshot. For a production system with large product lists or order histories, that is a lot of redundant payload. Streaming only the fields that actually changed would cut bandwidth.

Optimistic UI updates — The cart currently updates after the add_to_cart node returns and a state_snapshot arrives. For snappier UX, the frontend could update the cart optimistically when the button is clicked, then reconcile with the actual state when the snapshot arrives. This is a standard React pattern.
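Reduced to two pure helpers, the optimistic pattern could look like this. The names, the pending flag, and the shapes are hypothetical, not existing ShopAgent code:

```typescript
interface CartItem { productId: string; quantity: number; pending?: boolean; }

// Optimistic add: show the item immediately, flagged as pending so the
// UI can render it differently (e.g. greyed out) until confirmed.
function addOptimistic(cart: CartItem[], item: Omit<CartItem, "pending">): CartItem[] {
  return [...cart, { ...item, pending: true }];
}

// Reconcile: the next state_snapshot's cart is authoritative, so the
// optimistic copy is simply replaced by the streamed value.
function reconcile(_optimistic: CartItem[], serverCart: CartItem[]): CartItem[] {
  return serverCart;
}
```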

Persistent connection handling — If the SSE connection drops (network interruption, tab backgrounding), the current implementation does not automatically reconnect. CopilotKit handles reconnection in newer versions, but an explicit reconnect strategy with exponential backoff would make the experience more resilient.
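A minimal version of such a strategy, assuming a generic connect() placeholder (which resolves when the stream closes cleanly and rejects on error) rather than any specific CopilotKit API:

```typescript
// Delay schedule for reconnect attempts: exponential backoff with a cap.
// The base and cap values are arbitrary illustrations.
function backoffDelayMs(attempt: number, baseMs = 500, capMs = 30_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

// Sketch of a reconnect loop; connect() is an assumed placeholder for
// opening the SSE stream and waiting until it ends.
async function reconnectLoop(connect: () => Promise<void>): Promise<void> {
  let attempt = 0;
  while (true) {
    try {
      await connect(); // resolves when the stream closes cleanly
      attempt = 0;     // clean close: reset the backoff schedule
    } catch {
      attempt++;       // error: back off harder next time
    }
    await new Promise((resolve) => setTimeout(resolve, backoffDelayMs(attempt)));
  }
}
```

Adding jitter to the delay would avoid thundering-herd reconnects, but the core idea is just the capped doubling schedule.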

THE TAKEAWAY

 

AG-UI eliminates the polling loop and the custom WebSocket layer that real-time agent UIs typically require. Every LangGraph state update — after each node completes — becomes an SSE event. The frontend receives it, merges it, and React renders the change. The developer writes Python nodes and TypeScript components. The protocol handles the bridge.

The merge-not-replace pattern in the frontend state hook is not obvious but essential. Without it, partial state snapshots would clear UI elements that the user can see, creating a jarring experience every time any background action runs.

 

The ShopAgent demo is live at https://shop-agent.agilecreativeminds.nl. See the demo showcase or follow the demo walkthrough. Built by Agile Creative Minds.