Flow trigger endpoints
Use the /run and /webhook endpoints to run flows.
To create, read, update, and delete flows, see Flow management endpoints.
Run flow
Langflow automatically generates Python, JavaScript, and curl code snippets for the /v1/run/$FLOW_ID endpoint for all flows.
For more information, see Generate API code snippets.
Execute a specified flow by ID or name. You can find a flow's ID in the code snippets in the API access pane or in the flow's URL.
The following example runs the Basic Prompting template flow with flow parameters passed in the request body.
This flow requires a chat input string (input_value), and uses default values for all other parameters.
- Python
- JavaScript
- curl
```python
import requests

url = "http://LANGFLOW_SERVER_URL/api/v1/run/FLOW_ID"

# Request payload
payload = {
    "input_value": "Tell me about something interesting!",
    "session_id": "chat-123",
    "input_type": "chat",
    "output_type": "chat",
    "output_component": ""
}

# Request headers
headers = {
    "Content-Type": "application/json",
    "x-api-key": "LANGFLOW_API_KEY"
}

try:
    response = requests.post(url, json=payload, headers=headers)
    response.raise_for_status()
    print(response.json())
except requests.exceptions.RequestException as e:
    print(f"Error making API request: {e}")
```
```javascript
const payload = {
  input_value: "Tell me about something interesting!",
  session_id: "chat-123",
  input_type: "chat",
  output_type: "chat",
  output_component: ""
};

const options = {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'x-api-key': 'LANGFLOW_API_KEY'
  },
  body: JSON.stringify(payload)
};

fetch('http://LANGFLOW_SERVER_URL/api/v1/run/FLOW_ID', options)
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(err => console.error(err));
```
```bash
curl -X POST \
  "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{
    "input_value": "Tell me about something interesting!",
    "session_id": "chat-123",
    "input_type": "chat",
    "output_type": "chat",
    "output_component": ""
  }'
```
The response from /v1/run/$FLOW_ID includes metadata, inputs, and outputs for the run.
Result
The following example illustrates a response from a Basic Prompting flow:
```json
{
  "session_id": "chat-123",
  "outputs": [{
    "inputs": {
      "input_value": "Tell me about something interesting!"
    },
    "outputs": [{
      "results": {
        "message": {
          "text": "Sure! Have you ever heard of the phenomenon known as \"bioluminescence\"? It's a fascinating natural occurrence where living organisms produce and emit light. This ability is found in various species, including certain types of jellyfish, fireflies, and deep-sea creatures like anglerfish.\n\nBioluminescence occurs through a chemical reaction in which a light-emitting molecule called luciferin reacts with oxygen, catalyzed by an enzyme called luciferase. The result is a beautiful glow that can serve various purposes, such as attracting mates, deterring predators, or luring prey.\n\nOne of the most stunning displays of bioluminescence can be seen in the ocean, where certain plankton emit light when disturbed, creating a mesmerizing blue glow in the water. This phenomenon is often referred to as \"sea sparkle\" and can be seen in coastal areas around the world.\n\nBioluminescence not only captivates our imagination but also has practical applications in science and medicine, including the development of biosensors and imaging techniques. It's a remarkable example of nature's creativity and complexity!",
          "sender": "Machine",
          "sender_name": "AI",
          "session_id": "chat-123",
          "timestamp": "2025-03-03T17:17:37+00:00",
          "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201",
          "properties": {
            "source": {
              "id": "OpenAIModel-d1wOZ",
              "display_name": "OpenAI",
              "source": "gpt-4o-mini"
            },
            "icon": "OpenAI"
          },
          "component_id": "ChatOutput-ylMzN"
        }
      }
    }]
  }]
}
```
If you are parsing the response in an application, you most likely need to extract the relevant content from the response, rather than pass the entire response back to the user. For an example of a script that extracts data from a Langflow API response, see the Quickstart.
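For instance, a minimal sketch of that kind of extraction, assuming the nested response structure shown in the example above (the function name and the trimmed sample dictionary are illustrative, not part of the Langflow API):

```python
# Minimal sketch of pulling the chat message text out of a /v1/run response,
# assuming the nested structure shown in the example above.
def extract_message(response_json: dict) -> str:
    # The first output's first result holds the chat message.
    return response_json["outputs"][0]["outputs"][0]["results"]["message"]["text"]

# Example response, trimmed to only the fields this function reads
result = {
    "session_id": "chat-123",
    "outputs": [{
        "outputs": [{
            "results": {"message": {"text": "Sure! Have you ever heard of bioluminescence?"}}
        }]
    }]
}

print(extract_message(result))  # → Sure! Have you ever heard of bioluminescence?
```

In a real application you would also guard against empty `outputs` lists and error responses before indexing into the structure.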
Stream LLM token responses
The /v1/run/$FLOW_ID endpoint executes the flow as a batch, with optional streaming of LLM token responses.
To stream LLM token responses, append the ?stream=true query parameter to the request:
- Python
- JavaScript
- curl
```python
import requests

url = "http://LANGFLOW_SERVER_URL/api/v1/run/FLOW_ID?stream=true"

# Request payload
payload = {
    "message": "Tell me something interesting!",
    "session_id": "chat-123"
}

# Request headers
headers = {
    "accept": "application/json",
    "Content-Type": "application/json",
    "x-api-key": "LANGFLOW_API_KEY"
}

try:
    response = requests.post(url, json=payload, headers=headers, stream=True)
    response.raise_for_status()

    # Process streaming response
    for line in response.iter_lines():
        if line:
            print(line.decode('utf-8'))
except requests.exceptions.RequestException as e:
    print(f"Error making API request: {e}")
```
```javascript
const payload = {
  message: "Tell me something interesting!",
  session_id: "chat-123"
};

const options = {
  method: 'POST',
  headers: {
    'accept': 'application/json',
    'Content-Type': 'application/json',
    'x-api-key': 'LANGFLOW_API_KEY'
  },
  body: JSON.stringify(payload)
};

fetch('http://LANGFLOW_SERVER_URL/api/v1/run/FLOW_ID?stream=true', options)
  .then(async response => {
    const reader = response.body?.getReader();
    const decoder = new TextDecoder();

    if (reader) {
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        console.log(decoder.decode(value));
      }
    }
  })
  .catch(err => console.error(err));
```
```bash
curl -X POST \
  "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID?stream=true" \
  -H "accept: application/json" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{
    "message": "Tell me something interesting!",
    "session_id": "chat-123"
  }'
```
LLM chat responses are streamed back as token events, culminating in a final end event that closes the connection.
Result
The following example is truncated to illustrate a series of token events as well as the final end event that closes the LLM's token streaming response:
```text
{"event": "add_message", "data": {"timestamp": "2025-03-03T17:20:18", "sender": "User", "sender_name": "User", "session_id": "chat-123", "text": "Tell me about something interesting!", "files": [], "error": false, "edit": false, "properties": {"text_color": "", "background_color": "", "edited": false, "source": {"id": null, "display_name": null, "source": null}, "icon": "", "allow_markdown": false, "positive_feedback": null, "state": "complete", "targets": []}, "category": "message", "content_blocks": [], "id": "0103a21b-ebf7-4c02-9d72-017fb297f812", "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201"}}

{"event": "add_message", "data": {"timestamp": "2025-03-03T17:20:18", "sender": "Machine", "sender_name": "AI", "session_id": "chat-123", "text": "", "files": [], "error": false, "edit": false, "properties": {"text_color": "", "background_color": "", "edited": false, "source": {"id": "OpenAIModel-d1wOZ", "display_name": "OpenAI", "source": "gpt-4o-mini"}, "icon": "OpenAI", "allow_markdown": false, "positive_feedback": null, "state": "complete", "targets": []}, "category": "message", "content_blocks": [], "id": "27b66789-e673-4c65-9e81-021752925161", "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201"}}

{"event": "token", "data": {"chunk": " Have", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}

{"event": "token", "data": {"chunk": " you", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}

{"event": "token", "data": {"chunk": " ever", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}

{"event": "token", "data": {"chunk": " heard", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}

{"event": "token", "data": {"chunk": " of", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}

{"event": "token", "data": {"chunk": " the", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}

{"event": "token", "data": {"chunk": " phenomenon", "id": "27b66789-e673-4c65-9e81-021752925161", "timestamp": "2025-03-03 17:20:18 UTC"}}

{"event": "end", "data": {"result": {"session_id": "chat-123", "message": "Sure! Have you ever heard of the phenomenon known as \"bioluminescence\"?..."}}}
```
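To reassemble the streamed message in a client, you can parse each non-empty line as JSON, accumulate the token chunks, and stop at the end event. The following is a sketch against the event shapes shown above (the function name and the trimmed sample lines are illustrative):

```python
import json

def collect_tokens(lines):
    """Accumulate 'token' event chunks until the 'end' event closes the stream."""
    chunks = []
    for line in lines:
        if not line.strip():
            continue  # skip blank separator lines
        event = json.loads(line)
        if event["event"] == "token":
            chunks.append(event["data"]["chunk"])
        elif event["event"] == "end":
            break
    return "".join(chunks)

# Example event lines, trimmed to only the fields this function reads
stream = [
    '{"event": "token", "data": {"chunk": " Have"}}',
    '{"event": "token", "data": {"chunk": " you"}}',
    '{"event": "end", "data": {"result": {"message": "..."}}}',
]
print(collect_tokens(stream))  # →  Have you
```

In practice you would feed `response.iter_lines()` (or the decoded chunks from a `fetch` reader) into a function like this instead of a hard-coded list.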
Run endpoint headers
| Header | Info | Example |
|---|---|---|
| Content-Type | Required. Specifies the JSON format. | "application/json" |
| accept | Optional. Specifies the response format. Defaults to JSON if not specified. | "application/json" |
| x-api-key | Required. Your Langflow API key for authentication. Can be passed as a header or query parameter. | "sk-..." |
| X-LANGFLOW-GLOBAL-VAR-* | Optional. Pass global variables to the flow. Variable names are automatically converted to uppercase. These variables take precedence over OS environment variables and are only available during this specific request execution. | "X-LANGFLOW-GLOBAL-VAR-API_KEY: sk-..." |
Run endpoint parameters
| Parameter | Type | Info |
|---|---|---|
| flow_id | UUID/string | Required. Part of URL: /run/$FLOW_ID |
| stream | Boolean | Optional. Query parameter: /run/$FLOW_ID?stream=true |
| input_value | string | Optional. JSON body field. Main input text/prompt. Default: null |
| input_type | string | Optional. JSON body field. Input type ("chat" or "text"). Default: "chat" |
| output_type | string | Optional. JSON body field. Output type ("chat", "any", "debug"). Default: "chat" |
| output_component | string | Optional. JSON body field. Target component for output. Default: "" |
| tweaks | object | Optional. JSON body field. Component adjustments. Default: null |
| session_id | string | Optional. JSON body field. Conversation context ID. See Session ID. Default: null |
Request example with all headers and parameters
- Python
- JavaScript
- curl
```python
import requests

url = "http://LANGFLOW_SERVER_URL/api/v1/run/FLOW_ID?stream=true"

# Request payload with tweaks
payload = {
    "input_value": "Tell me a story",
    "input_type": "chat",
    "output_type": "chat",
    "output_component": "chat_output",
    "session_id": "chat-123",
    "tweaks": {
        "component_id": {
            "parameter_name": "value"
        }
    }
}

# Request headers
headers = {
    "Content-Type": "application/json",
    "accept": "application/json",
    "x-api-key": "LANGFLOW_API_KEY"
}

try:
    response = requests.post(url, json=payload, headers=headers, stream=True)
    response.raise_for_status()

    # Process streaming response
    for line in response.iter_lines():
        if line:
            print(line.decode('utf-8'))
except requests.exceptions.RequestException as e:
    print(f"Error making API request: {e}")
```
```javascript
const payload = {
  input_value: "Tell me a story",
  input_type: "chat",
  output_type: "chat",
  output_component: "chat_output",
  session_id: "chat-123",
  tweaks: {
    component_id: {
      parameter_name: "value"
    }
  }
};

const options = {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'accept': 'application/json',
    'x-api-key': 'LANGFLOW_API_KEY'
  },
  body: JSON.stringify(payload)
};

fetch('http://LANGFLOW_SERVER_URL/api/v1/run/FLOW_ID?stream=true', options)
  .then(async response => {
    const reader = response.body?.getReader();
    const decoder = new TextDecoder();

    if (reader) {
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        console.log(decoder.decode(value));
      }
    }
  })
  .catch(err => console.error(err));
```
```bash
curl -X POST \
  "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID?stream=true" \
  -H "Content-Type: application/json" \
  -H "accept: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{
    "input_value": "Tell me a story",
    "input_type": "chat",
    "output_type": "chat",
    "output_component": "chat_output",
    "session_id": "chat-123",
    "tweaks": {
      "component_id": {
        "parameter_name": "value"
      }
    }
  }'
```
Pass global variables in request headers
You can pass global variables to your flow using HTTP headers with the format X-LANGFLOW-GLOBAL-VAR-{VARIABLE_NAME}.
Variables passed in headers take precedence over OS environment variables. If a variable is provided in both a header and an environment variable, the header value is used. Variables are only available during this specific request execution and aren't persisted.
Variable names are automatically converted to uppercase. For example, X-LANGFLOW-GLOBAL-VAR-api-key becomes API_KEY in your flow.
You don't need to create these variables in Langflow's Global Variables section first. Pass any variable name using this header format.
- Python
- JavaScript
- curl
```python
import requests

url = "http://LANGFLOW_SERVER_URL/api/v1/run/FLOW_ID"

# Request payload
payload = {
    "input_value": "Tell me about something interesting!",
    "input_type": "chat",
    "output_type": "chat"
}

# Request headers with global variables
headers = {
    "Content-Type": "application/json",
    "x-api-key": "LANGFLOW_API_KEY",
    "X-LANGFLOW-GLOBAL-VAR-OPENAI_API_KEY": "sk-...",
    "X-LANGFLOW-GLOBAL-VAR-USER_ID": "user123",
    "X-LANGFLOW-GLOBAL-VAR-ENVIRONMENT": "production"
}

try:
    response = requests.post(url, json=payload, headers=headers)
    response.raise_for_status()
    print(response.json())
except requests.exceptions.RequestException as e:
    print(f"Error making API request: {e}")
```
```javascript
const payload = {
  input_value: "Tell me about something interesting!",
  input_type: "chat",
  output_type: "chat"
};

const options = {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'x-api-key': 'LANGFLOW_API_KEY',
    'X-LANGFLOW-GLOBAL-VAR-OPENAI_API_KEY': 'sk-...',
    'X-LANGFLOW-GLOBAL-VAR-USER_ID': 'user123',
    'X-LANGFLOW-GLOBAL-VAR-ENVIRONMENT': 'production'
  },
  body: JSON.stringify(payload)
};

fetch('http://LANGFLOW_SERVER_URL/api/v1/run/FLOW_ID', options)
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(err => console.error(err));
```
```bash
curl -X POST \
  "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -H "X-LANGFLOW-GLOBAL-VAR-OPENAI_API_KEY: sk-..." \
  -H "X-LANGFLOW-GLOBAL-VAR-USER_ID: user123" \
  -H "X-LANGFLOW-GLOBAL-VAR-ENVIRONMENT: production" \
  -d '{
    "input_value": "Tell me about something interesting!",
    "input_type": "chat",
    "output_type": "chat"
  }'
```
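As a rough illustration of the naming rule, the following sketch strips the header prefix, uppercases the rest, and replaces hyphens with underscores. This behavior is inferred from the api-key to API_KEY example above, not taken from the Langflow source, so treat it as an assumption:

```python
def header_to_variable_name(header: str) -> str:
    """Sketch of the assumed conversion from header name to variable name:
    strip the prefix, uppercase, and turn hyphens into underscores."""
    prefix = "X-LANGFLOW-GLOBAL-VAR-"
    return header[len(prefix):].upper().replace("-", "_")

print(header_to_variable_name("X-LANGFLOW-GLOBAL-VAR-api-key"))  # → API_KEY
```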
If your flow components reference variables that aren't provided in headers or your Langflow database, the flow fails by default. To avoid this, you can set LANGFLOW_FALLBACK_TO_ENV_VAR=True in your .env file, which allows the flow to use values from OS environment variables if they aren't otherwise specified.
Webhook run flow
Use the /webhook endpoint to start a flow by sending an HTTP POST request.
After you add a Webhook component to a flow, open the API access pane, and then click the Webhook curl tab to get an automatically generated POST /webhook request for your flow.
For more information, see Trigger flows with webhooks.
```bash
curl -X POST \
  "$LANGFLOW_SERVER_URL/api/v1/webhook/$FLOW_ID" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{"data": "example-data"}'
```
Result
```json
{
  "message": "Task started in the background",
  "status": "in progress"
}
```
Deprecated flow trigger endpoints
The following endpoints are deprecated and replaced by the /run endpoint:
- /process
- /predict