Version: 1.10.x (Next)

Build endpoints

info

The /build endpoints are used by Langflow's frontend visual editor code. These endpoints are part of the internal Langflow codebase.

Don't use these endpoints to run flows from your own applications. To trigger flows from your apps, see Flow trigger endpoints.

The /build endpoints support Langflow's frontend code for building flows in the Langflow visual editor. You can use these endpoints to build vertices and flows, as well as execute flows with streaming event responses. You might need to use or understand these endpoints when contributing to the Langflow codebase.

Build flow and stream events

This endpoint builds and executes a flow, returning a job ID that can be used to stream execution events.

  1. Send a POST request to the /build/$FLOW_ID/flow endpoint:

    ```python
    import os

    import requests

    url = f"{os.getenv('LANGFLOW_URL', '')}/api/v1/build/{os.getenv('FLOW_ID', '')}/flow"

    headers = {
        "accept": "application/json",
        "Content-Type": "application/json",
        "x-api-key": os.getenv('LANGFLOW_API_KEY', ''),
    }

    payload = {"inputs": {"input_value": "Tell me a story"}}

    response = requests.post(url, headers=headers, json=payload)
    response.raise_for_status()

    print(response.text)
    ```
    Result

    ```json
    {
      "job_id": "123e4567-e89b-12d3-a456-426614174000"
    }
    ```
  2. After receiving a job ID from the build endpoint, use the /build/$JOB_ID/events endpoint to stream the execution results:

    ```python
    import os

    import requests

    url = f"{os.getenv('LANGFLOW_URL', '')}/api/v1/build/{os.getenv('JOB_ID', '')}/events"

    headers = {
        "accept": "application/json",
        "x-api-key": os.getenv('LANGFLOW_API_KEY', ''),
    }

    response = requests.get(url, headers=headers)
    response.raise_for_status()

    print(response.text)
    ```
    Result

    ```text
    {"event": "vertices_sorted", "data": {"ids": ["ChatInput-XtBLx"], "to_run": ["Prompt-x74Ze", "ChatOutput-ylMzN", "ChatInput-XtBLx", "OpenAIModel-d1wOZ"]}}

    {"event": "add_message", "data": {"timestamp": "2025-03-03T17:42:23", "sender": "User", "sender_name": "User", "session_id": "d2bbd92b-187e-4c84-b2d4-5df365704201", "text": "Tell me a story", "files": [], "error": false, "edit": false, "properties": {"text_color": "", "background_color": "", "edited": false, "source": {"id": null, "display_name": null, "source": null}, "icon": "", "allow_markdown": false, "positive_feedback": null, "state": "complete", "targets": []}, "category": "message", "content_blocks": [], "id": "28879bd8-6a68-4dd5-b658-74d643a4dd92", "flow_id": "d2bbd92b-187e-4c84-b2d4-5df365704201"}}

    // ... Additional events as the flow executes ...

    {"event": "end", "data": {}}
    ```

The /build/$JOB_ID/events endpoint has a stream query parameter that defaults to true. To disable streaming and receive all events in a single response, set ?stream=false.

```python
import os

import requests

base = os.getenv("LANGFLOW_URL", "")
job_id = os.getenv("JOB_ID", "")
api_key = os.getenv("LANGFLOW_API_KEY", "")

# With ?stream=false, the endpoint returns all events in a single finite
# JSON response instead of holding a streaming connection open.
url = f"{base}/api/v1/build/{job_id}/events?stream=false"

headers = {"accept": "application/json", "x-api-key": api_key}

response = requests.get(url, headers=headers, timeout=60)
response.raise_for_status()

print(response.text)
```

Build headers

| Header | Info | Example |
|---|---|---|
| Content-Type | Required. Specifies the JSON format. | "application/json" |
| accept | Optional. Specifies the response format. | "application/json" |
| x-api-key | Optional. Required only if authentication is enabled. | "sk-..." |

Build parameters

| Parameter | Type | Description |
|---|---|---|
| inputs | object | Optional. Input values for flow components. |
| data | object | Optional. Flow data to override stored configuration. |
| files | array[string] | Optional. List of file paths to use. |
| start_component_id | string | Optional. ID of the component where the execution should start. Component id values can be found in Langflow JSON files. |
| stop_component_id | string | Optional. ID of the component where the execution should stop. Component id values can be found in Langflow JSON files. |
| log_builds | boolean | Optional. Whether to record build logs. Default: true. |
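Since all of these fields are optional, a request body only needs the fields you set. A small helper (hypothetical, not part of Langflow) that assembles a payload and drops unset fields:

```python
def build_payload(inputs=None, data=None, files=None,
                  start_component_id=None, stop_component_id=None,
                  log_builds=True):
    """Assemble a /build request payload, omitting optional fields left unset."""
    optional = {
        "inputs": inputs,
        "data": data,
        "files": files,
        "start_component_id": start_component_id,
        "stop_component_id": stop_component_id,
    }
    payload = {key: value for key, value in optional.items() if value is not None}
    payload["log_builds"] = log_builds  # defaults to true, matching the API
    return payload


# Example: run up to a specific component with custom input.
payload = build_payload(
    inputs={"input_value": "Tell me a story"},
    stop_component_id="OpenAIModel-Uksag",
)
```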

Set start and stop points

The /build/$FLOW_ID/flow endpoint accepts optional start_component_id and stop_component_id values to control where the flow run starts and stops. Setting stop_component_id triggers the same behavior as clicking Run component on that component in the visual editor: the specified component and all upstream components it depends on are run.

The following example stops flow execution at an OpenAI component:

```python
import os

import requests

url = f"{os.getenv('LANGFLOW_URL', '')}/api/v1/build/{os.getenv('FLOW_ID', '')}/flow"

headers = {
    "accept": "application/json",
    "Content-Type": "application/json",
    "x-api-key": os.getenv('LANGFLOW_API_KEY', ''),
}

payload = {"stop_component_id": "OpenAIModel-Uksag"}

response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()

print(response.text)
```
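To build intuition for which components actually run with stop_component_id set, the "component plus everything upstream of it" set can be sketched as a reverse traversal over the flow's edges. This is a simplified model for illustration, not Langflow's actual scheduler:

```python
from collections import deque


def upstream_closure(edges, stop_id):
    """Return the stop component plus all components upstream of it.

    edges: iterable of (source_id, target_id) pairs from the flow graph.
    A simplified model of the set executed when stop_component_id is set.
    """
    parents = {}
    for source, target in edges:
        parents.setdefault(target, set()).add(source)
    seen = {stop_id}
    queue = deque([stop_id])
    while queue:
        node = queue.popleft()
        for parent in parents.get(node, ()):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen
```

For a linear chat flow (ChatInput → Prompt → OpenAIModel → ChatOutput), stopping at the OpenAI component runs the first three and skips ChatOutput, matching the Run component behavior described above.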

Override flow parameters

The /build/$FLOW_ID/flow endpoint also accepts flow data directly, instead of using the values stored in the Langflow database. This is useful for running a flow with custom values without editing the flow in the visual editor.

```python
import os

import requests

url = f"{os.getenv('LANGFLOW_URL', '')}/api/v1/build/{os.getenv('FLOW_ID', '')}/flow"

headers = {
    "accept": "application/json",
    "Content-Type": "application/json",
    "x-api-key": os.getenv('LANGFLOW_API_KEY', ''),
}

payload = {
    "data": {"nodes": [], "edges": []},
    "inputs": {"input_value": "Your custom input here", "session": "session_id"},
}

response = requests.post(url, headers=headers, json=payload)
response.raise_for_status()

print(response.text)
```
Result

```json
{ "job_id": "0bcc7f23-40b4-4bfa-9b8a-a44181fd1175" }
```
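The two-step pattern used throughout this page (POST to /flow for a job ID, then GET the job's /events) can be wrapped in a small helper. The URL construction below follows the endpoints shown above; the request calls themselves are sketched in comments:

```python
def build_endpoint_urls(base_url, flow_id, job_id=None):
    """Compose the /build endpoint URLs used on this page.

    Returns (build_url, events_url); events_url is None until a job_id exists.
    """
    base = base_url.rstrip("/")
    build_url = f"{base}/api/v1/build/{flow_id}/flow"
    events_url = f"{base}/api/v1/build/{job_id}/events" if job_id else None
    return build_url, events_url


# With a live server (requests assumed):
#   build_url, _ = build_endpoint_urls(base, flow_id)
#   job_id = requests.post(build_url, headers=headers, json=payload).json()["job_id"]
#   _, events_url = build_endpoint_urls(base, flow_id, job_id)
#   events = requests.get(events_url, headers=headers).text
```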
