
Flow trigger endpoints

Use the /run and /webhook endpoints to run flows.

To create, read, update, and delete flows, see Flow management endpoints.

Run flow

tip

Langflow automatically generates Python, JavaScript, and curl code snippets for the /v1/run/$FLOW_ID endpoint for all flows. For more information, see Generate API code snippets.

Execute a specified flow by ID or name. Flow IDs can be found in the code snippets in the API access pane or in a flow's URL.

The following example runs a Basic Prompting flow with flow parameters passed in the request body. This flow requires a chat input string (input_value), and uses default values for all other parameters.


curl -X POST \
  "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{
    "input_value": "Tell me about something interesting!",
    "session_id": "chat-123",
    "input_type": "chat",
    "output_type": "chat",
    "output_component": "",
    "tweaks": null
  }'
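
A minimal Python equivalent of this request, using the requests library, might look like the following sketch. It assumes the same LANGFLOW_SERVER_URL, FLOW_ID, and LANGFLOW_API_KEY environment variables as the curl example.

import os
import requests

# Assumes the same environment variables as the curl example above.
url = f"{os.environ['LANGFLOW_SERVER_URL']}/api/v1/run/{os.environ['FLOW_ID']}"
headers = {
    "Content-Type": "application/json",
    "x-api-key": os.environ["LANGFLOW_API_KEY"],
}
payload = {
    "input_value": "Tell me about something interesting!",
    "session_id": "chat-123",
    "input_type": "chat",
    "output_type": "chat",
    "output_component": "",
    "tweaks": None,
}

response = requests.post(url, json=payload, headers=headers, timeout=60)
response.raise_for_status()
print(response.json())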

The response from /v1/run/$FLOW_ID includes metadata, inputs, and outputs for the run.

Result

The following example illustrates a response from a Basic Prompting flow:


{
  "session_id": "chat-123",
  "outputs": [{
    "inputs": {
      "input_value": "Tell me about something interesting!"
    },
    "outputs": [{
      "results": {
        "message": {
          "text": "Sure! Have you ever heard of the phenomenon known as \"bioluminescence\"? It's a fascinating natural occurrence where living organisms produce and emit light. This ability is found in various species, including certain types of jellyfish, fireflies, and deep-sea creatures like anglerfish.\n\nBioluminescence occurs through a chemical reaction in which a light-emitting molecule called luciferin reacts with oxygen, catalyzed by an enzyme called luciferase. The result is a beautiful glow that can serve various purposes, such as attracting mates, deterring predators, or luring prey.\n\nOne of the most stunning displays of bioluminescence can be seen in the ocean, where certain plankton emit light when disturbed, creating a mesmerizing blue glow in the water. This phenomenon is often referred to as \"sea sparkle\" and can be seen in coastal areas around the world.\n\nBioluminescence not only captivates our imagination but also has practical applications in science and medicine, including the development of biosensors and imaging techniques. It's a remarkable example of nature's creativity and complexity!",
          "sender": "Machine",
          "sender_name": "AI",
          "session_id": "chat-123",
          "timestamp": "2025-03-03T17:17:37+00:00",
          "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201",
          "properties": {
            "source": {
              "id": "OpenAIModel-d1wOZ",
              "display_name": "OpenAI",
              "source": "gpt-4o-mini"
            },
            "icon": "OpenAI"
          },
          "component_id": "ChatOutput-ylMzN"
        }
      }
    }]
  }]
}

If you are parsing the response in an application, you most likely need to extract the relevant content from the response, rather than pass the entire response back to the user. For an example of a script that extracts data from a Langflow API response, see the Quickstart.
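
For example, continuing the Python sketch above, the chat message text can be pulled out of the nested outputs structure. The path below follows the example response and may differ for flows with other output components.

# Parse the /v1/run response from the earlier sketch and extract the chat text.
data = response.json()
message_text = data["outputs"][0]["outputs"][0]["results"]["message"]["text"]
print(message_text)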

Stream LLM token responses

With /v1/run/$FLOW_ID, the flow is executed as a batch with optional LLM token response streaming.

To stream LLM token responses, append the ?stream=true query parameter to the request:


curl -X POST \
  "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID?stream=true" \
  -H "accept: application/json" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{
    "message": "Tell me something interesting!",
    "session_id": "chat-123"
  }'

LLM chat responses are streamed back as token events, culminating in a final end event that closes the connection.

Result

The following example is truncated to illustrate a series of token events as well as the final end event that closes the LLM's token streaming response:


{"event": "add_message", "data": {"timestamp": "2025-03-03T17:20:18", "sender": "User", "sender_name": "User", "session_id": "chat-123", "text": "Tell me about something interesting!", "files": [], "error": false, "edit": false, "properties": {"text_color": "", "background_color": "", "edited": false, "source": {"id": null, "display_name": null, "source": null}, "icon": "", "allow_markdown": false, "positive_feedback": null, "state": "complete", "targets": []}, "category": "message", "content_blocks": [], "id": "0103a21b-ebf7-4c02-9d72-017fb297f812", "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201"}}
{"event": "add_message", "data": {"timestamp": "2025-03-03T17:20:18", "sender": "Machine", "sender_name": "AI", "session_id": "chat-123", "text": "", "files": [], "error": false, "edit": false, "properties": {"text_color": "", "background_color": "", "edited": false, "source": {"id": "OpenAIModel-d1wOZ", "display_name": "OpenAI", "source": "gpt-4o-mini"}, "icon": "OpenAI", "allow_markdown": false, "positive_feedback": null, "state": "complete", "targets": []}, "category": "message", "content_blocks": [], "id": "27b66789-e673-4c65-9e81-021752925161", "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201"}}
{"event": "token", "data": {"chunk": " Have", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}
{"event": "token", "data": {"chunk": " you", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}
{"event": "token", "data": {"chunk": " ever", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}
{"event": "token", "data": {"chunk": " heard", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}
{"event": "token", "data": {"chunk": " of", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}
{"event": "token", "data": {"chunk": " the", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}
{"event": "token", "data": {"chunk": " phenomenon", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}
{"event": "end", "data": {"result": {"session_id": "chat-123", "message": "Sure! Have you ever heard of the phenomenon known as \"bioluminescence\"?..."}}}

Run endpoint headers

| Header | Info | Example |
| --- | --- | --- |
| Content-Type | Required. Specifies the JSON format. | "application/json" |
| accept | Optional. Specifies the response format. | "application/json" |
| x-api-key | Optional. Required only if authentication is enabled. | "sk-..." |

Run endpoint parameters

| Parameter | Type | Info |
| --- | --- | --- |
| flow_id | UUID/string | Required. Part of URL: /run/$FLOW_ID |
| stream | boolean | Optional. Query parameter: /run/$FLOW_ID?stream=true |
| input_value | string | Optional. JSON body field. Main input text/prompt. Default: null |
| input_type | string | Optional. JSON body field. Input type ("chat" or "text"). Default: "chat" |
| output_type | string | Optional. JSON body field. Output type ("chat", "any", "debug"). Default: "chat" |
| output_component | string | Optional. JSON body field. Target component for output. Default: "" |
| tweaks | object | Optional. JSON body field. Component adjustments. Default: null |
| session_id | string | Optional. JSON body field. Conversation context ID. See Session ID. Default: null |

Request example with all headers and parameters


curl -X POST \
  "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID?stream=true" \
  -H "Content-Type: application/json" \
  -H "accept: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{
    "input_value": "Tell me a story",
    "input_type": "chat",
    "output_type": "chat",
    "output_component": "chat_output",
    "session_id": "chat-123",
    "tweaks": {
      "component_id": {
        "parameter_name": "value"
      }
    }
  }'
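
If you build the request body programmatically, tweaks is a nested object that maps a component ID to the parameters you want to override. The following minimal Python sketch illustrates the shape; the component ID and the temperature parameter are hypothetical examples, not values from a real flow.

# tweaks maps component IDs to parameter overrides.
# "OpenAIModel-d1wOZ" and "temperature" are hypothetical; use the component IDs
# and parameter names from your own flow.
tweaks = {
    "OpenAIModel-d1wOZ": {
        "temperature": 0.2,
    },
}

payload = {
    "input_value": "Tell me a story",
    "session_id": "chat-123",
    "tweaks": tweaks,
}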

Webhook run flow

Use the /webhook endpoint to start a flow by sending an HTTP POST request.

tip

After you add a Webhook component to a flow, open the API access pane, and then click the Webhook cURL tab to get an automatically generated POST /webhook request for your flow.


curl -X POST \
  "$LANGFLOW_SERVER_URL/api/v1/webhook/$FLOW_ID" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{"data": "example-data"}'

Result

{
  "message": "Task started in the background",
  "status": "in progress"
}
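
A minimal Python sketch of the same webhook request, assuming the requests library and the same environment variables as the curl examples:

import os
import requests

# Trigger a flow through its Webhook component; the flow runs in the background.
url = f"{os.environ['LANGFLOW_SERVER_URL']}/api/v1/webhook/{os.environ['FLOW_ID']}"
headers = {"x-api-key": os.environ["LANGFLOW_API_KEY"]}

response = requests.post(url, json={"data": "example-data"}, headers=headers, timeout=30)
response.raise_for_status()
print(response.json())  # {"message": "Task started in the background", "status": "in progress"}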

For more information, see Webhook component and Trigger flows with webhooks.

Deprecated flow trigger endpoints

The following endpoints are deprecated and replaced by the /run endpoint:

  • /process
  • /predict