# Langflow TypeScript client
The Langflow TypeScript client allows your TypeScript applications to programmatically interact with the Langflow API.
For the client code repository, see langflow-client-ts.
For the npm package, see `@datastax/langflow-client`.
## Install the Langflow TypeScript package

To install the Langflow TypeScript client package, use one of the following commands:

- npm

  ```bash
  npm install @datastax/langflow-client
  ```

- yarn

  ```bash
  yarn add @datastax/langflow-client
  ```

- pnpm

  ```bash
  pnpm add @datastax/langflow-client
  ```
## Initialize the Langflow TypeScript client

- Import the client into your code.

  ```typescript
  import { LangflowClient } from "@datastax/langflow-client";
  ```

- Initialize a client object to interact with your server.

  The `LangflowClient` object allows you to interact with the Langflow API.
  Replace `BASE_URL` and `API_KEY` with values from your deployment.
  The default Langflow base URL is `http://localhost:7860`.
  To create an API key, see API keys.

  ```typescript
  const baseUrl = "BASE_URL";
  const apiKey = "API_KEY";
  const client = new LangflowClient({ baseUrl, apiKey });
  ```
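In a real application, you may prefer to read these values from the environment instead of hardcoding them. The following is a minimal sketch, assuming you export the variables `LANGFLOW_BASE_URL` and `LANGFLOW_API_KEY` before running your app (both names are illustrative, not required by the client):

```typescript
import { LangflowClient } from "@datastax/langflow-client";

// LANGFLOW_BASE_URL and LANGFLOW_API_KEY are hypothetical variable
// names chosen for this example; the client only sees the values.
const baseUrl = process.env.LANGFLOW_BASE_URL ?? "http://localhost:7860";
const apiKey = process.env.LANGFLOW_API_KEY; // undefined if not set

const client = new LangflowClient({ baseUrl, apiKey });
```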
## Langflow TypeScript client quickstart

- With your Langflow client initialized, submit a message to your Langflow server and receive a response.

  This example uses the minimum values for sending a message and running your flow on a Langflow server, with no API keys.
  Replace `baseUrl` and `flowId` with values from your deployment. The `input` string is the message you're sending to your flow.

  ```typescript
  import { LangflowClient } from "@datastax/langflow-client";

  const baseUrl = "http://127.0.0.1:7860";
  const client = new LangflowClient({ baseUrl });

  async function runFlow() {
    const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf";
    const flow = client.flow(flowId);
    const input = "Is anyone there?";

    const response = await flow.run(input);
    console.log(response);
  }

  runFlow().catch(console.error);
  ```
  Response:

  ```text
  FlowResponse {
    sessionId: 'aa5a238b-02c0-4f03-bc5c-cc3a83335cdf',
    outputs: [ { inputs: [Object], outputs: [Array] } ]
  }
  ```

  This confirms your client is connecting to Langflow.

  - The `sessionId` value is a unique identifier for the client-server session. For more information, see Session ID.
  - The `outputs` array contains the results of your flow execution.
- To get the full response objects from your server, change the `console.log` code to stringify the returned JSON object:

  ```typescript
  console.log(JSON.stringify(response, null, 2));
  ```

  The exact structure of the returned `inputs` and `outputs` depends on how your flow is configured in Langflow.
- To get the first chat message returned by the Chat Output component, change `console.log` to use the `chatOutputText` convenience method:

  ```typescript
  console.log(response.chatOutputText());
  ```

  A sketch that combines `run`, session reuse, and `chatOutputText` follows these steps.
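The pieces above can be combined into a small conversational script. The following is a minimal sketch, assuming the `FlowResponse` shape shown earlier and that `run` accepts the same options object as `stream` (demonstrated in the next section); verify both against the client's type definitions:

```typescript
import { LangflowClient } from "@datastax/langflow-client";

const baseUrl = "http://127.0.0.1:7860";
const client = new LangflowClient({ baseUrl });

async function chat() {
  const flow = client.flow("aa5a238b-02c0-4f03-bc5c-cc3a83335cdf");

  // The first message starts a new session on the server.
  const first = await flow.run("Is anyone there?");
  console.log(first.chatOutputText());

  // Reusing the returned session ID keeps the follow-up message in the
  // same conversation. Assumption: `run` takes a { session_id } option
  // like `stream` does.
  const followUp = await flow.run("What can you help me with?", {
    session_id: first.sessionId,
  });
  console.log(followUp.chatOutputText());
}

chat().catch(console.error);
```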
## Use advanced TypeScript client features

The TypeScript client can do more than connect to your server and run a flow.
This example builds on the quickstart with additional features for interacting with Langflow.
- Pass tweaks with the request as an object.
  Tweaks change values within components for all calls to your flow.
  This example tweaks the OpenAI model component to enforce use of the `gpt-4o-mini` model.

  ```typescript
  const tweaks = { model_name: "gpt-4o-mini" };
  ```
- Pass a session ID with the request to maintain the same conversation with the LLM from this application.

  ```typescript
  const session_id = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf";
  ```
- Instead of calling `run` on the `Flow` object, call `stream` with the same arguments.
  The response is a `ReadableStream` of objects. For more information on streaming Langflow responses, see Run flow.
  A sketch that assembles the streamed tokens into a single reply follows the example response below.

  ```typescript
  const response = await client.flow(flowId).stream(input);

  for await (const event of response) {
    console.log(event);
  }
  ```
- Run the completed TypeScript application to call your server with `tweaks` and `session_id`, and stream the response back.
  Replace `baseUrl` and `flowId` with values from your deployment.

  ```typescript
  import { LangflowClient } from "@datastax/langflow-client";

  const baseUrl = "http://127.0.0.1:7860";
  const client = new LangflowClient({ baseUrl });

  async function runFlow() {
    const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf";
    const input = "Is anyone there?";
    const tweaks = { model_name: "gpt-4o-mini" };
    const session_id = "test-session";

    const response = await client.flow(flowId).stream(input, {
      session_id,
      tweaks,
    });

    for await (const event of response) {
      console.log(event);
    }
  }

  runFlow().catch(console.error);
  ```
  Response:

  ```text
  {
    event: 'add_message',
    data: {
      timestamp: '2025-05-23 15:52:48 UTC',
      sender: 'User',
      sender_name: 'User',
      session_id: 'test-session',
      text: 'Is anyone there?',
      files: [],
      error: false,
      edit: false,
      properties: {
        text_color: '',
        background_color: '',
        edited: false,
        source: [Object],
        icon: '',
        allow_markdown: false,
        positive_feedback: null,
        state: 'complete',
        targets: []
      },
      category: 'message',
      content_blocks: [],
      id: '7f096715-3f2d-4d84-88d6-5e2f76bf3fbe',
      flow_id: 'aa5a238b-02c0-4f03-bc5c-cc3a83335cdf',
      duration: null
    }
  }
  {
    event: 'token',
    data: {
      chunk: 'Absolutely',
      id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
      timestamp: '2025-05-23 15:52:48 UTC'
    }
  }
  {
    event: 'token',
    data: {
      chunk: ',',
      id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
      timestamp: '2025-05-23 15:52:48 UTC'
    }
  }
  {
    event: 'token',
    data: {
      chunk: " I'm",
      id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
      timestamp: '2025-05-23 15:52:48 UTC'
    }
  }
  {
    event: 'token',
    data: {
      chunk: ' here',
      id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
      timestamp: '2025-05-23 15:52:48 UTC'
    }
  }

  // this response is abbreviated

  {
    event: 'end',
    data: { result: { session_id: 'test-session', outputs: [Array] } }
  }
  ```
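Each streamed object carries an `event` name and a `data` payload, as shown in the response above. If you only need the assembled reply text rather than raw events, you can filter for `token` events and concatenate their chunks. This is a minimal sketch based on the event shape above, not a dedicated client API:

```typescript
import { LangflowClient } from "@datastax/langflow-client";

const client = new LangflowClient({ baseUrl: "http://127.0.0.1:7860" });

async function streamReply() {
  const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf";
  const response = await client.flow(flowId).stream("Is anyone there?");

  let reply = "";
  for await (const event of response) {
    // Each token event carries one chunk of the model's reply.
    if (event.event === "token") {
      reply += event.data.chunk;
    }
  }
  console.log("Assembled reply:", reply);
}

streamReply().catch(console.error);
```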
## Retrieve Langflow logs with the TypeScript client

To retrieve Langflow logs, you must enable log retrieval on your Langflow server by including the following values in your server's `.env` file:

```text
LANGFLOW_ENABLE_LOG_RETRIEVAL=true
LANGFLOW_LOG_RETRIEVER_BUFFER_SIZE=10000
LANGFLOW_LOG_LEVEL=DEBUG
```
For more information, see API examples.
This complete example starts streaming logs in the background, and then runs a flow so you can see how a flow executes.
Replace `baseUrl` and `flowId` with values from your deployment.

```typescript
import { LangflowClient } from "@datastax/langflow-client";

const baseUrl = "http://127.0.0.1:7863";
const flowId = "86f0bf45-0544-4e88-b0b1-8e622da7a7f0";

async function runFlow(client: LangflowClient) {
  const input = "Is anyone there?";
  const response = await client.flow(flowId).run(input);
  console.log('Flow response:', response);
}

async function main() {
  const client = new LangflowClient({ baseUrl });

  // Start streaming logs in the background. The stream doesn't end on
  // its own, so consume it in a detached async task instead of awaiting
  // it here; otherwise the flow below would never run.
  console.log('Starting log stream...');
  (async () => {
    for await (const log of await client.logs.stream()) {
      console.log('Log:', log);
    }
  })().catch(console.error);

  // Run the flow
  await runFlow(client);
}

main().catch(console.error);
```
Logs begin streaming indefinitely, and the flow runs once.
The logs below are abbreviated, but you can monitor how the flow instantiates its components, configures its model, and processes the outputs.
Response:

```text
Starting log stream...
Log: Log {
  timestamp: 2025-05-30T11:49:16.006Z,
  message: '2025-05-30T07:49:16.006127-0400 DEBUG Instantiating ChatInput of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.029Z,
  message: '2025-05-30T07:49:16.029957-0400 DEBUG Instantiating Prompt of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.049Z,
  message: '2025-05-30T07:49:16.049520-0400 DEBUG Instantiating ChatOutput of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.069Z,
  message: '2025-05-30T07:49:16.069359-0400 DEBUG Instantiating OpenAIModel of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.086Z,
  message: "2025-05-30T07:49:16.086426-0400 DEBUG Running layer 0 with 2 tasks, ['ChatInput-xjucM', 'Prompt-I3pxU']\n"
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.101Z,
  message: '2025-05-30T07:49:16.101766-0400 DEBUG Building Chat Input\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.113Z,
  message: '2025-05-30T07:49:16.113343-0400 DEBUG Building Prompt\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.131Z,
  message: '2025-05-30T07:49:16.131423-0400 DEBUG Logged vertex build: 6bd9fe9c-5eea-4f05-a96d-f6de9dc77e3c\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.143Z,
  message: '2025-05-30T07:49:16.143295-0400 DEBUG Logged vertex build: 39c68ec9-3859-4fff-9b14-80b3271f8fbf\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.188Z,
  message: "2025-05-30T07:49:16.188730-0400 DEBUG Running layer 1 with 1 tasks, ['OpenAIModel-RtlZm']\n"
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.201Z,
  message: '2025-05-30T07:49:16.201946-0400 DEBUG Building OpenAI\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.216Z,
  message: '2025-05-30T07:49:16.216622-0400 INFO Model name: gpt-4.1-mini\n'
}
Flow response: FlowResponse {
  sessionId: '86f0bf45-0544-4e88-b0b1-8e622da7a7f0',
  outputs: [ { inputs: [Object], outputs: [Array] } ]
}
Log: Log {
  timestamp: 2025-05-30T11:49:18.094Z,
  message: `2025-05-30T07:49:18.094364-0400 DEBUG Vertex OpenAIModel-RtlZm, result: <langflow.graph.utils.UnbuiltResult object at 0x364d24dd0>, object: {'text_output': "Hey there! I'm here and ready to help you build something awesome with AI. What are you thinking about creating today?"}\n`
}
```
The `FlowResponse` object is returned to the client, with the `outputs` array including your flow result.
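Because each `Log` object exposes `timestamp` and `message` fields, as shown in the output above, you can also filter logs client-side instead of printing everything. The following is a minimal sketch, assuming the log line format shown above, with level names such as DEBUG, INFO, WARNING, and ERROR embedded in `message`:

```typescript
import { LangflowClient } from "@datastax/langflow-client";

const client = new LangflowClient({ baseUrl: "http://127.0.0.1:7863" });

async function watchForProblems() {
  for await (const log of await client.logs.stream()) {
    // Keep only lines whose embedded level is WARNING or ERROR.
    if (/\b(WARNING|ERROR)\b/.test(log.message)) {
      console.error(`[${log.timestamp.toISOString()}] ${log.message.trim()}`);
    }
  }
}

watchForProblems().catch(console.error);
```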
## Langflow TypeScript project repository
You can do even more with the Langflow TypeScript client.
For more information, see the langflow-client-ts repository.