This endpoint requires a Connect-RPC client and cannot be tested with a simple HTTP request. See the Client SDKs page for implementation examples.
Overview
The Stream Chat endpoint allows you to receive chat responses in real time as they are generated. This is a server-streaming Connect-RPC endpoint that works over standard HTTP/1.1 or HTTP/2.
Endpoint: POST /v1/stream
Stream Flow
1
Initialize Client
Create a Connect-RPC transport with your API base URL and authentication token. Connect-RPC works over standard HTTP, so any HTTP client can be used with the Connect protocol.
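Because Connect works over plain HTTP, initializing a "transport" amounts to fixing a base URL and a set of request headers. A minimal sketch in Python; the bearer-token scheme here is an assumption, so substitute whatever authentication your deployment expects:

```python
def connect_headers(token: str) -> dict:
    """Headers for a Connect streaming call with JSON payloads.

    The Authorization scheme is an assumption; replace it with the
    authentication your API base URL actually requires.
    """
    return {
        "Content-Type": "application/connect+json",
        "Authorization": f"Bearer {token}",
    }
```

Any HTTP client that can send a POST with these headers and read the response body incrementally can speak the Connect protocol.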
2
Send Request
Send a StreamRequest with your question, an optional chat_id (for continuation), and a tools configuration. If no chat_id is provided, a new chat session is created.
3
Receive Initial Metadata
The first StreamResponse contains metadata with:
- chat_id: Use this for subsequent messages in the same conversation
- model: The LLM model being used
- is_continuation: Whether this continues an existing chat
- status: Initially QUERY_STATUS_IN_PROGRESS
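Handling that first message can be sketched over a decoded JSON dict; the field names mirror the list above but should be treated as assumptions until checked against the generated client types:

```python
def read_initial_metadata(first_response: dict) -> str:
    """Return the chat_id from the first StreamResponse's metadata.

    Field names here mirror the docs; verify them against your
    generated client types before relying on them.
    """
    meta = first_response["metadata"]
    if meta["status"] != "QUERY_STATUS_IN_PROGRESS":
        raise ValueError(f"unexpected initial status: {meta['status']}")
    # Keep the chat_id so follow-up requests continue this conversation.
    return meta["chat_id"]
```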
4
Stream Content
Receive multiple StreamResponse messages as content is generated:
- meta_data: Chat ID, model
- text: Incremental text deltas (append these to build the full response)
- preview: Results and visualizations (charts, tables)
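The accumulation step above can be sketched as a small fold over the decoded messages; the dict keys are assumptions based on the stream flow description:

```python
def consume_stream(messages):
    """Fold incremental text deltas into the full answer and collect previews.

    `messages` is an iterable of decoded StreamResponse dicts; the field
    names are assumptions based on the stream flow described above.
    """
    text_parts, previews = [], []
    for msg in messages:
        if "text" in msg:
            # Deltas are incremental: append, don't replace.
            text_parts.append(msg["text"])
        elif "preview" in msg:
            previews.append(msg["preview"])
    return "".join(text_parts), previews
```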
Request Body
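The exact schema comes from the StreamRequest proto; as a hypothetical JSON-shaped sketch, with every field name taken from the flow description above and values purely illustrative:

```python
# Hypothetical StreamRequest payload built from the fields described above.
stream_request = {
    "question": "Which connector had the most rows ingested last week?",
    # Omit chat_id to start a new chat session; pass the chat_id from a
    # previous stream's metadata to continue that conversation.
    "tools": {
        "connector_ids": [1],
        "sql_enabled": True,
        "web_search_enabled": False,
    },
}
```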
Response
The endpoint streams StreamResponse messages containing:
- metadata: Chat metadata including ID, model, status, and token usage
- cell: Executable cells (SQL, Python, etc.)
- text: Text content deltas
- preview: Preview cells with results
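On the wire, a Connect streaming body is a sequence of enveloped frames: one flags byte, a 4-byte big-endian length, then the serialized message, with flag 0x02 marking the end-of-stream frame. A minimal decoder sketch (generated clients do this for you; this is only to show the framing):

```python
import struct

END_STREAM = 0x02  # flag set on the final frame of a Connect stream


def decode_frames(body: bytes):
    """Yield (flags, payload) pairs from a Connect streaming response body."""
    offset = 0
    while offset + 5 <= len(body):
        flags = body[offset]
        (length,) = struct.unpack(">I", body[offset + 1 : offset + 5])
        payload = body[offset + 5 : offset + 5 + length]
        yield flags, payload
        offset += 5 + length
```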
StreamMetadata
| Field | Type | Description |
|---|---|---|
| id | string | Message ID |
| created_at | timestamp | Creation timestamp |
| model | string | LLM model used |
| chat_id | string | Chat session ID |
| metadata | map | Additional metadata |
| is_continuation | boolean | Whether this continues an existing chat |
| usage | ChatUsage | Token usage statistics |
| status | ChatStatus | Current status (IN_PROGRESS, COMPLETED, FAILED) |
| error | string | Error message if failed |
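A small helper over this metadata shows how status and error interact; the terminal status strings are assumptions extrapolated from the QUERY_STATUS_IN_PROGRESS value seen in the stream flow:

```python
TERMINAL_STATUSES = {"QUERY_STATUS_COMPLETED", "QUERY_STATUS_FAILED"}


def check_metadata(meta: dict) -> bool:
    """Return True once the stream reports a terminal status.

    Raises on failure so the error field is surfaced. Status names are
    assumptions extrapolated from QUERY_STATUS_IN_PROGRESS.
    """
    status = meta.get("status")
    if status == "QUERY_STATUS_FAILED":
        raise RuntimeError(meta.get("error", "query failed"))
    return status in TERMINAL_STATUSES
```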
ChatTools
| Field | Type | Description |
|---|---|---|
| connector_ids | int32[] | Database connector IDs to use |
| web_search_enabled | boolean | Enable web search |
| sql_enabled | boolean | Enable SQL execution |
| ontology_enabled | boolean | Enable ontology queries |
| python_enabled | boolean | Enable Python code execution |
| streamlit_enabled | boolean | Enable Streamlit apps |
| google_drive_enabled | boolean | Enable Google Drive access |
| powerbi_enabled | boolean | Enable Power BI access |
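Putting the table together, a fully spelled-out tools configuration might look like the following; the connector IDs and enabled flags are purely illustrative:

```python
# Illustrative ChatTools configuration; every field comes from the table above.
chat_tools = {
    "connector_ids": [1, 2],       # database connectors the chat may query
    "web_search_enabled": False,
    "sql_enabled": True,
    "ontology_enabled": False,
    "python_enabled": True,
    "streamlit_enabled": False,
    "google_drive_enabled": False,
    "powerbi_enabled": False,
}
```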
Notes
- This endpoint uses Connect-RPC server streaming over standard HTTP/1.1 or HTTP/2
- Connect-RPC is fully HTTP-compatible - no special gRPC infrastructure needed
- Responses are streamed incrementally as the AI generates content
- The stream will remain open until the query completes, fails, or is cancelled
- Use the Cancel Stream endpoint to stop an in-progress query