105 changes: 83 additions & 22 deletions src/langsmith/use-stream-react.mdx
* [Agent Server](/langsmith/agent-server)
</Info>

The [`useStream()`](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html) React hook provides a seamless way to integrate LangGraph into your React applications. It handles all the complexities of streaming, state management, and branching logic, letting you focus on building great chat experiences.

Key features:

* Conversation branching: create alternate conversation paths from any point in the chat history
* UI-agnostic design: bring your own components and styling

Let's explore how to use [`useStream()`](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html) in your React application.

The [`useStream()`](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html) hook provides a solid foundation for creating bespoke chat experiences. For pre-built chat components and interfaces, we also recommend checking out [CopilotKit](https://docs.copilotkit.ai/coagents/quickstart/langgraph) and [assistant-ui](https://www.assistant-ui.com/docs/runtimes/langgraph).

## Install the SDK

```bash
npm install @langchain/langgraph-sdk @langchain/core
```

## Customizing your UI

The [`useStream()`](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html) hook takes care of all the complex state management behind the scenes, providing you with simple interfaces to build your UI. Here's what you get out of the box:

* Thread state management
* Loading and error states

Here are some examples of how to use these features effectively:

### Loading states

The `isLoading` property tells you when a stream is active, so you can show progress indicators, disable input controls while the graph runs, or surface a cancel button.
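As a minimal sketch, app code can derive UI state from `isLoading` (the helper is illustrative; wiring the cancel action to `thread.stop()` is an assumption based on the SDK's stop method):

```typescript
// Illustrative helper: while a stream is active, the send button becomes a
// cancel button; otherwise it submits, and is disabled for empty drafts.
type SendButton = { label: "Stop" | "Send"; disabled: boolean };

function sendButtonState(isLoading: boolean, draft: string): SendButton {
  if (isLoading) return { label: "Stop", disabled: false };
  return { label: "Send", disabled: draft.trim().length === 0 };
}
```

In a component, the button's click handler would call `thread.stop()` while `isLoading` is true and `thread.submit(...)` otherwise.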

### Resume a stream after page refresh

The [`useStream()`](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html) hook can automatically resume an ongoing run upon mounting by setting `reconnectOnMount: true`. This is useful for continuing a stream after a page refresh, ensuring no messages and events generated during the downtime are lost.

```tsx
const thread = useStream<{ messages: Message[] }>({
  apiUrl: "http://localhost:2024",
  assistantId: "agent",
  reconnectOnMount: true,
});
```

### Thread management

Keep track of conversations with built-in thread management. You can access the current thread ID and get notified when new threads are created.

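A sketch of the relevant options (assuming the same local dev server as the other examples; in a component, `threadId` would come from state or the URL, and `onThreadId` would store it):

```typescript
// Sketch: the options that wire up thread tracking. `onThreadId` is called
// when the server creates a new thread.
let currentThreadId: string | null = null;

const streamOptions = {
  apiUrl: "http://localhost:2024",
  assistantId: "agent",
  threadId: currentThreadId, // resume an existing thread, or null to create one
  onThreadId: (id: string) => {
    currentThreadId = id; // capture the ID of a newly created thread
  },
};
```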

We recommend storing the `threadId` in your URL's query parameters to let users resume conversations after page refreshes.
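For example, a pair of small helpers (illustrative, not part of the SDK) can round-trip the thread ID through the query string:

```typescript
// Illustrative helpers for keeping `threadId` in the URL's query string so a
// page refresh can resume the same conversation.
function getThreadIdFromSearch(search: string): string | null {
  return new URLSearchParams(search).get("threadId");
}

function withThreadId(search: string, threadId: string): string {
  const params = new URLSearchParams(search);
  params.set("threadId", threadId);
  return `?${params.toString()}`;
}
```

On mount, pass `getThreadIdFromSearch(window.location.search)` as the hook's `threadId`, and in `onThreadId` push `withThreadId(...)` into the browser history.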

### Message handling

The [`useStream()`](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html) hook will keep track of the message chunks received from the server and concatenate them together to form a complete message. The completed message chunks can be retrieved via the `messages` property.

By default, the `messagesKey` is set to `messages`, where it will append the new messages chunks to `values["messages"]`. If you store messages in a different key, you can change the value of `messagesKey`.

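Depending on the model and graph, a message's `content` may be a plain string or an array of content blocks; a helper along these lines (a sketch, not an SDK API, with a simplified block shape) flattens it for display:

```typescript
// Sketch: flatten message content (string or content-block array) to plain
// text for rendering. The block shape here is simplified for illustration.
type ContentBlock = { type: string; text?: string };

function messageText(content: string | ContentBlock[]): string {
  if (typeof content === "string") return content;
  return content
    .filter((block) => block.type === "text" && typeof block.text === "string")
    .map((block) => block.text as string)
    .join("");
}
```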

Under the hood, [`useStream()`](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html) automatically subscribes to multiple [stream modes](/langsmith/streaming#supported-stream-modes) to provide a complete picture of your graph's execution. The `messages` property specifically uses `messages-tuple` mode to receive individual LLM tokens from chat model invocations. Learn more about messages streaming in the [streaming](/langsmith/streaming#messages) guide.

### Accessing full graph state

Beyond messages, you can access the complete graph state via the `values` property. This includes any state your graph maintains, not just the conversation history:

```tsx
const thread = useStream<{ messages: Message[]; context: string; metadata: Record<string, unknown> }>({
apiUrl: "http://localhost:2024",
assistantId: "agent",
messagesKey: "messages",
});

// Access the full state
console.log(thread.values);
// { messages: [...], context: "...", metadata: {...} }

// Or access specific state keys
const context = thread.values?.context;
```

This is powered by the `values` stream mode under the hood, which streams the full state after each graph step.
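Because `values` can be empty before the first state chunk arrives (hence the optional chaining above), a tiny generic helper (purely illustrative app code) can centralize the fallback:

```typescript
// Sketch: read a key from the streamed state with a fallback for the period
// before the first `values` chunk arrives.
function selectValue<T extends object, K extends keyof T>(
  values: T | undefined,
  key: K,
  fallback: NonNullable<T[K]>,
): NonNullable<T[K]> {
  const value = values?.[key];
  return (value ?? fallback) as NonNullable<T[K]>;
}
```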

### Interrupts

The [`useStream()`](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html) hook exposes the `interrupt` property, which will be filled with the last interrupt from the thread. You can use interrupts to:

* Render a confirmation UI before executing a node
* Wait for human input, allowing the agent to ask the user clarifying questions

### Optimistic thread creation

Use the `threadId` option of the `submit` function to enable optimistic UI patterns where you need to know the thread ID before the thread is actually created.


### TypeScript

The [`useStream()`](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html) hook works well in TypeScript applications: you can specify types for the state to get better type safety and IDE support.

```tsx
// Define your types
type State = {
  messages: Message[];
};

// Use them with the hook (additional type parameters are available for
// custom events and configurable options)
const thread = useStream<State>({
  apiUrl: "http://localhost:2024",
  assistantId: "agent",
  messagesKey: "messages",
});
```

## Event handling

The [`useStream()`](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html) hook provides callback options that give you access to different types of streaming events beyond just messages. You don't need to explicitly configure stream modes; just pass callbacks for the event types you want to handle:

```tsx
const thread = useStream<{ messages: Message[] }>({
apiUrl: "http://localhost:2024",
assistantId: "agent",
messagesKey: "messages",

// Handle state updates after each graph step
onUpdateEvent: (update, options) => {
console.log("Graph update:", update);
// Access which node produced this update, the new state values, etc.
},

// Handle custom events streamed from your graph
onCustomEvent: (event, options) => {
console.log("Custom event:", event);
// React to progress updates, debug info, or any custom data
},

// Handle metadata events with run/thread info
onMetadataEvent: (metadata) => {
console.log("Run ID:", metadata.run_id);
console.log("Thread ID:", metadata.thread_id);
},

onError: (error) => {
console.error("Stream error:", error);
},

onFinish: (state, options) => {
console.log("Stream finished with final state:", state);
},
});
```

### Available callbacks

| Callback | Description | Stream mode |
|----------|-------------|-------------|
| `onUpdateEvent` | Called when a state update is received after each graph step | `updates` |
| `onCustomEvent` | Called when a custom event is received from your graph. See the [streaming](/oss/langgraph/streaming#stream-custom-data) guide. | `custom` |
| `onMetadataEvent` | Called with run and thread metadata | `metadata` |
| `onError` | Called when an error occurs | - |
| `onFinish` | Called when the stream completes | - |

This design means you can access rich streaming data (state updates, custom events, metadata) without manually configuring stream modes; `useStream` handles the subscriptions for you.

## Learn more

* [useStream API Reference](https://reference.langchain.com/javascript/functions/_langchain_langgraph-sdk.react.useStream.html)