This endpoint requires a Connect-RPC client and cannot be tested with a simple HTTP request. See the Client SDKs page for implementation examples.

Overview

The Stream Chat endpoint lets you receive chat responses in real time as they are generated. This is a server-streaming Connect-RPC endpoint that works over standard HTTP/1.1 or HTTP/2.

Endpoint: POST /v1/stream

Stream Flow

1. Initialize Client

Create a Connect-RPC transport with your API base URL and authentication token. Connect-RPC works over standard HTTP, so any HTTP client can be used with the Connect protocol.
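As a minimal sketch of this setup step: the helper below assembles the URL and headers a plain-HTTP client would need to call this endpoint. The base URL and token here are placeholders, and the `application/connect+json` content type follows the Connect protocol's convention for streaming calls with the JSON codec.

```python
def make_transport_config(base_url: str, token: str) -> dict:
    """Build the connection settings for a Connect-RPC streaming call
    made with an ordinary HTTP client (placeholder URL and token)."""
    return {
        "url": base_url.rstrip("/") + "/v1/stream",
        "headers": {
            # Connect streaming RPCs with the JSON codec use this content type.
            "Content-Type": "application/connect+json",
            "Authorization": f"Bearer {token}",
        },
    }

config = make_transport_config("https://api.example.com", "MY_API_TOKEN")
```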
2. Send Request

Send a StreamRequest with your question, optional chat_id (for continuation), and tools configuration. If no chat_id is provided, a new chat session is created.
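A small sketch of building that request body, following the Request Body shape shown below: `chat_id` is only included when continuing an existing conversation, so omitting it starts a new chat session.

```python
from typing import Optional

def build_stream_request(question: str,
                         chat_id: Optional[str] = None,
                         tools: Optional[dict] = None) -> dict:
    """Assemble a StreamRequest body. Leaving chat_id out creates a
    new chat session; passing one continues that conversation."""
    body = {"question": question}
    if chat_id is not None:
        body["chat_id"] = chat_id
    if tools is not None:
        body["tools"] = tools
    return body

# New session: no chat_id in the payload.
new_chat = build_stream_request("What were Q3 sales?",
                                tools={"sql_enabled": True})
```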
3. Receive Initial Metadata

The first StreamResponse contains metadata with:
  • chat_id: Use this for subsequent messages in the same conversation
  • model: The LLM model being used
  • is_continuation: Whether this continues an existing chat
  • status: Initially QUERY_STATUS_IN_PROGRESS
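The capture of that first message can be sketched as below, using the field names from the StreamMetadata table later in this document. The exact failed-status string is an assumption that mirrors the `QUERY_STATUS_IN_PROGRESS` naming above.

```python
def read_initial_metadata(first_response: dict) -> str:
    """Extract the chat_id from the first StreamResponse's metadata,
    raising early if the server already reports a failure.
    (Status string for failure is assumed, not confirmed by the spec.)"""
    meta = first_response["metadata"]
    if meta.get("status") == "QUERY_STATUS_FAILED":
        raise RuntimeError(meta.get("error", "stream failed"))
    # Save this ID to continue the conversation in later requests.
    return meta["chat_id"]
```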
4. Stream Content

Receive multiple StreamResponse messages as content is generated:
  • metadata: Chat ID, model, and status updates
  • cell: Executable cells (SQL, Python, etc.)
  • text: Incremental text deltas (append these to build the full response)
  • preview: Results and visualizations (charts, tables)
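The delta-accumulation step above can be sketched as a simple fold over the stream, where each message carries one of the kinds listed and only `text` messages contribute to the final answer string:

```python
def collect_text(responses) -> str:
    """Concatenate the incremental text deltas from a sequence of
    StreamResponse messages (represented here as plain dicts)."""
    parts = []
    for msg in responses:
        if "text" in msg:
            parts.append(msg["text"])  # append each delta in order
    return "".join(parts)
```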

Request Body

{
  "question": "Your question here",
  "chat_id": "optional-chat-id",
  "tools": {
    "connector_ids": [1, 2],
    "web_search_enabled": true,
    "sql_enabled": true,
    "ontology_enabled": false,
    "python_enabled": true,
    "streamlit_enabled": false,
    "google_drive_enabled": false,
    "powerbi_enabled": false
  }
}

Response

The endpoint streams StreamResponse messages containing:
  • metadata: Chat metadata including ID, model, status, and token usage
  • cell: Executable cells (SQL, Python, etc.)
  • text: Text content deltas
  • preview: Preview cells with results

StreamMetadata

Field            Type        Description
id               string      Message ID
created_at       timestamp   Creation timestamp
model            string      LLM model used
chat_id          string      Chat session ID
metadata         map         Additional metadata
is_continuation  boolean     Whether this continues an existing chat
usage            ChatUsage   Token usage statistics
status           ChatStatus  Current status (IN_PROGRESS, COMPLETED, FAILED)
error            string      Error message if failed

ChatTools

Field                 Type     Description
connector_ids         int32[]  Database connector IDs to use
web_search_enabled    boolean  Enable web search
sql_enabled           boolean  Enable SQL execution
ontology_enabled      boolean  Enable ontology queries
python_enabled        boolean  Enable Python code execution
streamlit_enabled     boolean  Enable Streamlit apps
google_drive_enabled  boolean  Enable Google Drive access
powerbi_enabled       boolean  Enable Power BI access

Notes

  • This endpoint uses Connect-RPC server streaming over standard HTTP/1.1 or HTTP/2
  • Connect-RPC is fully HTTP-compatible; no special gRPC infrastructure is needed
  • Responses are streamed incrementally as the AI generates content
  • The stream will remain open until the query completes, fails, or is cancelled
  • Use the Cancel Stream endpoint to stop an in-progress query
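Because Connect-RPC streams travel over plain HTTP, the response body is a sequence of enveloped messages that any HTTP client can parse. Per the published Connect protocol, each message is prefixed with a 1-byte flags field and a 4-byte big-endian length; how this endpoint populates each payload is taken from the Response section above, but the parsing itself is generic:

```python
import json
import struct

def parse_connect_frames(data: bytes):
    """Split a Connect streaming response body into (flags, message) pairs.

    Each envelope is: 1 flag byte + 4-byte big-endian length + payload.
    A set 0x02 flag bit marks the trailing end-of-stream message."""
    offset, messages = 0, []
    while offset < len(data):
        flags, length = struct.unpack_from(">BI", data, offset)
        offset += 5
        payload = data[offset:offset + length]
        offset += length
        messages.append((flags, json.loads(payload)))
    return messages
```

This sketch assumes the JSON codec (`application/connect+json`); with the binary protobuf codec the payload bytes would be decoded with generated message classes instead of `json.loads`.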