Langflow TypeScript client

The Langflow TypeScript client allows your TypeScript applications to programmatically interact with the Langflow API.

For the client code repository, see langflow-client-ts.

For the npm package, see @datastax/langflow-client.

Install the Langflow TypeScript package

To install the Langflow TypeScript client package, use the following command:


```shell
npm install @datastax/langflow-client
```

Initialize the Langflow TypeScript client

  1. Import the client into your code.

```typescript
import { LangflowClient } from "@datastax/langflow-client";
```

  2. Initialize a LangflowClient object to interact with your server.

Replace BASE_URL and API_KEY with values from your deployment. The default Langflow base URL is http://localhost:7860. To create an API key, see API keys.


```typescript
const baseUrl = "BASE_URL";
const apiKey = "API_KEY";
const client = new LangflowClient({ baseUrl, apiKey });
```
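If you prefer not to hardcode these values, you can read them from environment variables. A minimal sketch, assuming the variable names LANGFLOW_BASE_URL and LANGFLOW_API_KEY (these names are this example's convention, not something the client reads automatically):

```typescript
// Hypothetical helper: resolve client settings from environment
// variables, falling back to the default local Langflow base URL.
export function resolveConfig(
  env: Record<string, string | undefined> = process.env
): { baseUrl: string; apiKey?: string } {
  return {
    baseUrl: env.LANGFLOW_BASE_URL ?? "http://localhost:7860",
    apiKey: env.LANGFLOW_API_KEY,
  };
}
```

You could then construct the client with `new LangflowClient(resolveConfig())`.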

Langflow TypeScript client quickstart

  1. With your Langflow client initialized, submit a message to your Langflow server and receive a response. This example uses the minimum values required to send a message and run a flow on a Langflow server, with no API key. Replace baseUrl and flowId with values from your deployment. The input string is the message you're sending to your flow.

```typescript
import { LangflowClient } from "@datastax/langflow-client";

const baseUrl = "http://127.0.0.1:7860";
const client = new LangflowClient({ baseUrl });

async function runFlow() {
  const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf";
  const flow = client.flow(flowId);
  const input = "Is anyone there?";

  const response = await flow.run(input);
  console.log(response);
}

runFlow().catch(console.error);
```

Response

```
FlowResponse {
  sessionId: 'aa5a238b-02c0-4f03-bc5c-cc3a83335cdf',
  outputs: [ { inputs: [Object], outputs: [Array] } ]
}
```

This confirms your client is connecting to Langflow.

  • The sessionId value is a unique identifier for the client-server session. For more information, see Session ID.
  • The outputs array contains the results of your flow execution.
  2. To see the full response from your server, change the console.log call to stringify the returned object:

```typescript
console.log(JSON.stringify(response, null, 2));
```

The exact structure of the returned inputs and outputs depends on how your flow is configured in Langflow.

  3. To get the first chat message returned from the Chat Output component, change console.log to use the chatOutputText convenience method:

```typescript
console.log(response.chatOutputText());
```

Use advanced TypeScript client features

The TypeScript client can do more than just connect to your server and run a flow.

This example builds on the quickstart with additional features for interacting with Langflow.

  1. Pass tweaks to your code as an object with the request. Tweaks change values within components for all calls to your flow. This example tweaks the OpenAI model component to enforce using the gpt-4o-mini model.

```typescript
const tweaks = { model_name: "gpt-4o-mini" };
```

  2. Pass a session ID with the request to maintain the same conversation with the LLM across calls from this application.

```typescript
const session_id = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf";
```

  3. Instead of calling run on the Flow object, call stream with the same arguments. The response is a ReadableStream of objects. For more information on streaming Langflow responses, see Run flow.

```typescript
const response = await client.flow(flowId).stream(input);

for await (const event of response) {
  console.log(event);
}
```
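To assemble the full reply text from the stream, you can concatenate the chunk fields of 'token' events. The helper below is an illustrative sketch (not part of the client), assuming the event shapes shown in the streamed response later in this section:

```typescript
// Illustrative helper: concatenate the text chunks from 'token' events.
// Assumes each 'token' event carries its text in data.chunk, as shown
// in the streamed response in this section.
interface StreamEvent {
  event: string;
  data: { chunk?: string };
}

export function collectTokens(events: Iterable<StreamEvent>): string {
  let text = "";
  for (const e of events) {
    if (e.event === "token" && typeof e.data.chunk === "string") {
      text += e.data.chunk;
    }
  }
  return text;
}
```

Inside the for await loop, the equivalent is appending event.data.chunk whenever event.event is 'token'.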

  4. Run the completed TypeScript application to call your server with tweaks and session_id, and stream the response back. Replace baseUrl and flowId with values from your deployment.

```typescript
import { LangflowClient } from "@datastax/langflow-client";

const baseUrl = "http://127.0.0.1:7860";
const client = new LangflowClient({ baseUrl });

async function runFlow() {
  const flowId = "aa5a238b-02c0-4f03-bc5c-cc3a83335cdf";
  const input = "Is anyone there?";
  const tweaks = { model_name: "gpt-4o-mini" };
  const session_id = "test-session";

  const response = await client.flow(flowId).stream(input, {
    session_id,
    tweaks,
  });

  for await (const event of response) {
    console.log(event);
  }
}

runFlow().catch(console.error);
```

Response

```
{
  event: 'add_message',
  data: {
    timestamp: '2025-05-23 15:52:48 UTC',
    sender: 'User',
    sender_name: 'User',
    session_id: 'test-session',
    text: 'Is anyone there?',
    files: [],
    error: false,
    edit: false,
    properties: {
      text_color: '',
      background_color: '',
      edited: false,
      source: [Object],
      icon: '',
      allow_markdown: false,
      positive_feedback: null,
      state: 'complete',
      targets: []
    },
    category: 'message',
    content_blocks: [],
    id: '7f096715-3f2d-4d84-88d6-5e2f76bf3fbe',
    flow_id: 'aa5a238b-02c0-4f03-bc5c-cc3a83335cdf',
    duration: null
  }
}
{
  event: 'token',
  data: {
    chunk: 'Absolutely',
    id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
    timestamp: '2025-05-23 15:52:48 UTC'
  }
}
{
  event: 'token',
  data: {
    chunk: ',',
    id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
    timestamp: '2025-05-23 15:52:48 UTC'
  }
}
{
  event: 'token',
  data: {
    chunk: " I'm",
    id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
    timestamp: '2025-05-23 15:52:48 UTC'
  }
}
{
  event: 'token',
  data: {
    chunk: ' here',
    id: 'c5a99314-6b23-488b-84e2-038aa3e87fb5',
    timestamp: '2025-05-23 15:52:48 UTC'
  }
}

// this response is abbreviated

{
  event: 'end',
  data: { result: { session_id: 'test-session', outputs: [Array] } }
}
```

Retrieve Langflow logs with the TypeScript client

To retrieve Langflow logs, you must enable log retrieval on your Langflow server by including the following values in your server's .env file:


```
LANGFLOW_ENABLE_LOG_RETRIEVAL=true
LANGFLOW_LOG_RETRIEVER_BUFFER_SIZE=10000
LANGFLOW_LOG_LEVEL=DEBUG
```

For more information, see API examples.

This complete example starts streaming logs in the background, and then runs a flow so you can see how a flow executes. Replace baseUrl and flowId with values from your deployment.


```typescript
import { LangflowClient } from "@datastax/langflow-client";

const baseUrl = "http://127.0.0.1:7863";
const flowId = "86f0bf45-0544-4e88-b0b1-8e622da7a7f0";

async function runFlow(client: LangflowClient) {
  const input = "Is anyone there?";
  const response = await client.flow(flowId).run(input);
  console.log('Flow response:', response);
}

async function streamLogs(client: LangflowClient) {
  for await (const log of await client.logs.stream()) {
    console.log('Log:', log);
  }
}

async function main() {
  const client = new LangflowClient({ baseUrl });

  // Start streaming logs in the background.
  // Don't await streamLogs here: the log stream runs indefinitely,
  // so awaiting it would block the flow run below from ever starting.
  console.log('Starting log stream...');
  streamLogs(client).catch(console.error);

  // Run the flow
  await runFlow(client);
}

main().catch(console.error);
```

Logs begin streaming indefinitely, and the flow runs once.

The logs below are abbreviated, but you can monitor how the flow instantiates its components, configures its model, and processes the outputs.

Response

```
Starting log stream...
Log: Log {
  timestamp: 2025-05-30T11:49:16.006Z,
  message: '2025-05-30T07:49:16.006127-0400 DEBUG Instantiating ChatInput of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.029Z,
  message: '2025-05-30T07:49:16.029957-0400 DEBUG Instantiating Prompt of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.049Z,
  message: '2025-05-30T07:49:16.049520-0400 DEBUG Instantiating ChatOutput of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.069Z,
  message: '2025-05-30T07:49:16.069359-0400 DEBUG Instantiating OpenAIModel of type component\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.086Z,
  message: "2025-05-30T07:49:16.086426-0400 DEBUG Running layer 0 with 2 tasks, ['ChatInput-xjucM', 'Prompt-I3pxU']\n"
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.101Z,
  message: '2025-05-30T07:49:16.101766-0400 DEBUG Building Chat Input\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.113Z,
  message: '2025-05-30T07:49:16.113343-0400 DEBUG Building Prompt\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.131Z,
  message: '2025-05-30T07:49:16.131423-0400 DEBUG Logged vertex build: 6bd9fe9c-5eea-4f05-a96d-f6de9dc77e3c\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.143Z,
  message: '2025-05-30T07:49:16.143295-0400 DEBUG Logged vertex build: 39c68ec9-3859-4fff-9b14-80b3271f8fbf\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.188Z,
  message: "2025-05-30T07:49:16.188730-0400 DEBUG Running layer 1 with 1 tasks, ['OpenAIModel-RtlZm']\n"
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.201Z,
  message: '2025-05-30T07:49:16.201946-0400 DEBUG Building OpenAI\n'
}
Log: Log {
  timestamp: 2025-05-30T11:49:16.216Z,
  message: '2025-05-30T07:49:16.216622-0400 INFO Model name: gpt-4.1-mini\n'
}
Flow response: FlowResponse {
  sessionId: '86f0bf45-0544-4e88-b0b1-8e622da7a7f0',
  outputs: [ { inputs: [Object], outputs: [Array] } ]
}
Log: Log {
  timestamp: 2025-05-30T11:49:18.094Z,
  message: `2025-05-30T07:49:18.094364-0400 DEBUG Vertex OpenAIModel-RtlZm, result: <langflow.graph.utils.UnbuiltResult object at 0x364d24dd0>, object: {'text_output': "Hey there! I'm here and ready to help you build something awesome with AI. What are you thinking about creating today?"}\n`
}
```

The FlowResponse object is returned to the client, with the outputs array including your flow result.
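With LANGFLOW_LOG_LEVEL=DEBUG, the stream can be noisy. Each log's message begins with a timestamp followed by the level, as in the output above, so you can filter by severity client-side. A small illustrative parser (a sketch, not part of the client API):

```typescript
// Illustrative helper: extract the log level from a raw Langflow log
// message, which starts with a timestamp followed by the level, e.g.
// "2025-05-30T07:49:16.006127-0400 DEBUG Instantiating ChatInput ...".
export function logLevel(message: string): string | undefined {
  const match = message.match(/^\S+\s+(DEBUG|INFO|WARNING|ERROR|CRITICAL)\b/);
  return match?.[1];
}
```

For example, you could skip DEBUG entries by checking `logLevel(log.message)` before printing.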

Langflow TypeScript project repository

You can do even more with the Langflow TypeScript client.

For more information, see the langflow-client-ts repository.
