Deploy Langflow on watsonx Orchestrate
Create a flow and deploy it to IBM watsonx Orchestrate.
Deploying a flow on IBM watsonx Orchestrate is different from the other Langflow deployment options. This workflow does not deploy a full-featured Langflow server and flow builder UI. Instead, Langflow packages your selected flow and flow version, and then publishes it to IBM watsonx Orchestrate as a tool that an IBM watsonx Orchestrate agent can call. Langflow is used to build and configure the flow, while IBM watsonx Orchestrate hosts the agent experience and invokes the deployed flow as part of that agent's toolset.
Prerequisites
- Install and start Langflow
- Create an OpenAI API key
- Create an IBM watsonx Orchestrate instance
Create and deploy a flow
1. Create a flow in the Langflow UI, such as the Simple Agent starter flow in the Quickstart.

2. Click Deploy. The Provider pane opens.

3. Enter the Name, Service Instance URL, and API Key from your IBM watsonx Orchestrate instance. You can find these values on the Settings page of your IBM watsonx Orchestrate instance.

   - Name: YOUR_DEPLOYMENT_NAME
   - Service Instance URL: https://api.dl.watson-orchestrate.ibm.com/instances/80194572-4421-6735-91ab-55c0d8e4f962
   - API Key: YOUR_WATSONX_ORCHESTRATE_API_KEY

   The last segment of the Service Instance URL is the IBM watsonx Orchestrate tenant ID, which you can find in your watsonx Orchestrate deployment. In this example, the tenant ID is 80194572-4421-6735-91ab-55c0d8e4f962.
4. Click Next. The Deployment Type pane opens.

5. Enter a Type, Agent Name, Model, and Description.

   The Type is always Agent. The deployed flow is an IBM watsonx Orchestrate agent with your flow available as a tool the agent can call.

   The Model list is populated from the connected watsonx Orchestrate instance, not Langflow.
6. Click the Attach Flows tab to open the Attach Flows pane, and then select a flow and flow version to deploy.

7. Click the Create Connections tab to open the Create Connections pane, and then create a new connection or select an existing connection to bind to the flow.

   To create a new connection, do the following:

   1. Enter a Connection Name and any environment variables the flow requires, such as the OPENAI_API_KEY. Langflow auto-detects global variables from the flow JSON file, and you can add more variables.
   2. Click Create Connection to add the new connection to the list of available connections.
   3. In the list of available connections, select the new connection, and then click Attach Connection to Flow.

   Tip: To attach the connection to the flow without binding environment variables, click Skip, and then click Next.

   For more information, see Build flows.
8. Click Next. The Review & Confirm pane opens.

9. Confirm that the deployment values are correct, and then click Deploy.

   Langflow automatically installs any required extra dependencies on your watsonx Orchestrate tenant.

   In the Langflow UI, Deployment successful indicates that your deployment succeeded.

   Tip: If you get an error that the tool name already exists on your deployment, click Edit to change the tool name.

10. Click Test to open a chat window with your agent on watsonx Orchestrate. Enter a question, and the agent responds using the connected flow as a tool.

11. Navigate to your IBM watsonx Orchestrate deployment, and then confirm that your Langflow flow is listed as an agent.
Manage deployments in Langflow
From the Projects page, click Deployments to open the deployment management screen.
- Deployments: A Deployment is a published watsonx Orchestrate agent created from a specific Langflow flow version. Deployment details include the agent name, type, attached flows, model, and the IBM watsonx Orchestrate environment it belongs to. Use the Deployments tab to create, update, view, and delete flow deployments in Langflow.

- Deployment Environments: A Deployment Environment is a saved watsonx Orchestrate target that Langflow can deploy to. An environment stores the connection details for a watsonx Orchestrate tenant. Use the Deployment Environments tab to connect, view, and disconnect IBM watsonx Orchestrate environments in Langflow. To manage the tenant itself, use the IBM watsonx Orchestrate dashboard.
Send requests to your flow
After you deploy your flow to IBM watsonx Orchestrate, you can connect to it through the Langflow deployment run endpoints.
Don't use the /run endpoint for flows deployed to IBM watsonx Orchestrate.
Instead, use POST /api/v1/deployments/{deployment_id}/runs to start a run, and GET /api/v1/deployments/{deployment_id}/runs/{run_id} to check its status.
Endpoint paths must be prefixed with your Langflow server URL, such as http://localhost:7860.
Create deployment run endpoint
Endpoint: POST /api/v1/deployments/{deployment_id}/runs
Description: Start a run for a deployed flow and return a provider-owned run ID that you can poll for status.
Example request
- Python
- JavaScript
- curl
```python
import requests

url = "http://LANGFLOW_SERVER_ADDRESS/api/v1/deployments/DEPLOYMENT_ID/runs"

payload = {
    "provider_data": {
        "input": "Summarize today's tickets",
        "thread_id": "thread-123"
    }
}

headers = {
    "Content-Type": "application/json",
    "x-api-key": "LANGFLOW_API_KEY"
}

response = requests.post(url, json=payload, headers=headers)
response.raise_for_status()

print(response.json())
```
```javascript
const payload = {
  provider_data: {
    input: "Summarize today's tickets",
    thread_id: "thread-123"
  }
};

const options = {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "x-api-key": "LANGFLOW_API_KEY"
  },
  body: JSON.stringify(payload)
};

fetch("http://LANGFLOW_SERVER_ADDRESS/api/v1/deployments/DEPLOYMENT_ID/runs", options)
  .then((response) => response.json())
  .then((response) => console.log(response))
  .catch((err) => console.error(err));
```
```shell
curl --request POST \
  --url "http://LANGFLOW_SERVER_ADDRESS/api/v1/deployments/DEPLOYMENT_ID/runs" \
  --header "Content-Type: application/json" \
  --header "x-api-key: LANGFLOW_API_KEY" \
  --data '{
    "provider_data": {
      "input": "Summarize today'\''s tickets",
      "thread_id": "thread-123"
    }
  }'
```
Request body
| Field | Type | Required | Description |
|---|---|---|---|
| provider_data.input | string | Yes | The prompt or message content to send to the deployed agent. |
| provider_data.thread_id | string | No | Optional thread identifier to continue an existing conversation. |
Example response
```json
{
  "deployment_id": "3ea34379-1f72-4a33-9f6e-9e3ca88365b5",
  "provider_data": {
    "id": "run-42",
    "agent_id": "agent-123",
    "thread_id": "thread-123",
    "status": "accepted",
    "result": null,
    "started_at": null,
    "completed_at": null,
    "failed_at": null,
    "cancelled_at": null,
    "last_error": null
  }
}
```
Response body
The response returns the Langflow deployment_id and a provider_data object containing the provider-owned run metadata.
Use provider_data.id as the run_id when checking the run status.
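For example, given a parsed create-run response like the example above, the run ID is nested under provider_data (the data dictionary here is a trimmed copy of that example response):

```python
# Parsed JSON body returned by POST /api/v1/deployments/{deployment_id}/runs
data = {
    "deployment_id": "3ea34379-1f72-4a33-9f6e-9e3ca88365b5",
    "provider_data": {"id": "run-42", "status": "accepted"},
}

# Use this value as run_id when calling the run status endpoint.
run_id = data["provider_data"]["id"]
```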
Get deployment run status endpoint
Endpoint: GET /api/v1/deployments/{deployment_id}/runs/{run_id}
Description: Retrieve the current status and result of a deployment run.
Example request
- Python
- JavaScript
- curl
```python
import requests

url = "http://LANGFLOW_SERVER_ADDRESS/api/v1/deployments/DEPLOYMENT_ID/runs/RUN_ID"

headers = {
    "Content-Type": "application/json",
    "x-api-key": "LANGFLOW_API_KEY"
}

response = requests.get(url, headers=headers)
response.raise_for_status()

print(response.json())
```
```javascript
const options = {
  method: "GET",
  headers: {
    "Content-Type": "application/json",
    "x-api-key": "LANGFLOW_API_KEY"
  }
};

fetch("http://LANGFLOW_SERVER_ADDRESS/api/v1/deployments/DEPLOYMENT_ID/runs/RUN_ID", options)
  .then((response) => response.json())
  .then((response) => console.log(response))
  .catch((err) => console.error(err));
```
```shell
curl --request GET \
  --url "http://LANGFLOW_SERVER_ADDRESS/api/v1/deployments/DEPLOYMENT_ID/runs/RUN_ID" \
  --header "Content-Type: application/json" \
  --header "x-api-key: LANGFLOW_API_KEY"
```
Path parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| deployment_id | uuid | Yes | The Langflow deployment ID for the deployed flow. |
| run_id | string | Yes | The provider-owned run ID returned in provider_data.id. |
Example response
```json
{
  "deployment_id": "3ea34379-1f72-4a33-9f6e-9e3ca88365b5",
  "provider_data": {
    "id": "run-42",
    "agent_id": "agent-123",
    "thread_id": "thread-123",
    "status": "completed",
    "result": {
      "output": "Here is your summary..."
    },
    "started_at": "2026-04-03T12:40:00Z",
    "completed_at": "2026-04-03T12:40:05Z",
    "failed_at": null,
    "cancelled_at": null,
    "last_error": null
  }
}
```
Response body
Check provider_data.status to determine whether the run is still processing or has finished.
When the status is completed, read the output from provider_data.result.
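Putting the two endpoints together, a client typically polls the status endpoint until the run reaches a terminal state. The sketch below is one way to structure that loop, not an official client: the accepted and completed statuses come from the example responses above, while failed and cancelled are assumed terminal states inferred from the failed_at and cancelled_at response fields. The wait_for_run helper takes a get_status callable so the polling logic stays separate from the HTTP call.

```python
import time

# Assumed terminal states, inferred from the example response fields;
# adjust if your deployment reports different status values.
TERMINAL_STATUSES = {"completed", "failed", "cancelled"}

def wait_for_run(get_status, timeout=60.0, interval=2.0):
    """Poll get_status() until provider_data.status is terminal, or time out.

    get_status is a zero-argument callable that returns the parsed JSON body
    of GET /api/v1/deployments/{deployment_id}/runs/{run_id}.
    """
    deadline = time.monotonic() + timeout
    while True:
        provider_data = get_status()["provider_data"]
        if provider_data["status"] in TERMINAL_STATUSES:
            return provider_data
        if time.monotonic() >= deadline:
            raise TimeoutError(
                f"run still {provider_data['status']!r} after {timeout}s"
            )
        time.sleep(interval)
```

With the requests-based example above, get_status could be as simple as `lambda: requests.get(url, headers=headers).json()`; keeping the HTTP call behind a callable also makes the polling loop easy to unit test with canned responses.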