
Deploy Langflow on Kubernetes

This guide demonstrates deploying Langflow on a Kubernetes cluster.

Two charts are available at the Langflow Helm Charts repository:

  • Deploy the Langflow IDE for the complete Langflow development environment.
  • Deploy the Langflow runtime to run a standalone Langflow application in a more secure and stable environment.

Deploy the Langflow IDE

The Langflow IDE deployment is a complete environment for developers to create, test, and debug their flows. It includes both the API and the UI.

The langflow-ide Helm chart is available in the Langflow Helm Charts repository.

Prerequisites

Prepare a Kubernetes cluster

This example uses Minikube, but you can use any Kubernetes cluster.

  1. Create a Kubernetes cluster on Minikube.

     ```shell
     minikube start
     ```

  2. Set kubectl to use Minikube.

     ```shell
     kubectl config use-context minikube
     ```

Install the Langflow IDE Helm chart

  1. Add the repository to Helm and update it.

     ```shell
     helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
     helm repo update
     ```

  2. Install Langflow with the default options in the langflow namespace.

     ```shell
     helm install langflow-ide langflow/langflow-ide -n langflow --create-namespace
     ```

  3. Check the status of the pods.

     ```shell
     kubectl get pods -n langflow
     ```

     ```text
     NAME                                 READY   STATUS    RESTARTS   AGE
     langflow-0                           1/1     Running   0          33s
     langflow-frontend-5d9c558dbb-g7tc9   1/1     Running   0          38s
     ```

Configure port forwarding to access Langflow

Enable local port forwarding to access Langflow from your local machine.

  1. To make the Langflow API accessible from your local machine at port 7860:

     ```shell
     kubectl port-forward -n langflow svc/langflow-service-backend 7860:7860
     ```

  2. To make the Langflow UI accessible from your local machine at port 8080:

     ```shell
     kubectl port-forward -n langflow svc/langflow-service 8080:8080
     ```

Now you can access:

  • The Langflow API at http://localhost:7860
  • The Langflow UI at http://localhost:8080

Configure the Langflow version

Langflow is deployed with the latest version by default.

To specify a different Langflow version, set the langflow.backend.image.tag and langflow.frontend.image.tag values in the values.yaml file.


```yaml
langflow:
  backend:
    image:
      tag: "1.0.0a59"
  frontend:
    image:
      tag: "1.0.0a59"
```

Configure external storage

By default, the chart deploys a SQLite database stored in a local persistent disk. If you want to use an external PostgreSQL database, you can configure it in two ways:

  • Use the built-in PostgreSQL chart:

    ```yaml
    postgresql:
      enabled: true
      auth:
        username: "langflow"
        password: "langflow-postgres"
        database: "langflow-db"
    ```

  • Use an external database:

    ```yaml
    postgresql:
      enabled: false

    langflow:
      backend:
        externalDatabase:
          enabled: true
          driver:
            value: "postgresql"
          port:
            value: "5432"
          user:
            value: "langflow"
          password:
            valueFrom:
              secretKeyRef:
                key: "password"
                name: "your-secret-name"
          database:
            value: "langflow-db"
        sqlite:
          enabled: false
    ```
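The externalDatabase password above is read from a Kubernetes Secret that you manage yourself. As a sketch, the Secret could be created from a manifest like the following; the name your-secret-name and the key password are placeholders that must match the secretKeyRef in your chart values:

```yaml
# Hypothetical Secret matching the secretKeyRef above.
# Replace the placeholder with your real database password.
apiVersion: v1
kind: Secret
metadata:
  name: your-secret-name
  namespace: langflow
type: Opaque
stringData:
  password: "<your-database-password>"
```

Apply it with kubectl apply -n langflow -f secret.yaml before installing the chart, so the backend pod can resolve the reference at startup.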

Configure scaling

Scale the number of replicas and resources for both frontend and backend services:


```yaml
langflow:
  backend:
    replicaCount: 1
    resources:
      requests:
        cpu: 0.5
        memory: 1Gi
      # limits:
      #   cpu: 0.5
      #   memory: 1Gi

  frontend:
    enabled: true
    replicaCount: 1
    resources:
      requests:
        cpu: 0.3
        memory: 512Mi
      # limits:
      #   cpu: 0.3
      #   memory: 512Mi
```
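The replicaCount values above are static. If your cluster runs metrics-server, you could layer a standard Kubernetes HorizontalPodAutoscaler on top of the chart. The following is only a sketch with assumed workload names: the backend pod in this guide is langflow-0, which suggests a StatefulSet named langflow, but verify the actual name and kind with kubectl get statefulset,deploy -n langflow before applying:

```yaml
# Hypothetical HPA; scaleTargetRef name and kind are assumptions,
# not values guaranteed by the langflow-ide chart.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: langflow-backend-hpa
  namespace: langflow
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: StatefulSet
    name: langflow
  minReplicas: 1
  maxReplicas: 3
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 80
```

Note that autoscaling only helps if the backend replicas can share state, for example through the external PostgreSQL database described above.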

Deploy the Langflow runtime

The runtime chart is tailored for deploying applications in a production environment. It is focused on stability, performance, isolation, and security to ensure that applications run reliably and efficiently.

The langflow-runtime Helm chart is available in the Langflow Helm Charts repository.

Prerequisites

The prerequisites are the same as for the Langflow IDE deployment: a Kubernetes cluster with kubectl configured, and Helm installed.

Install the Langflow runtime Helm chart

  1. Add the repository to Helm and update it.

     ```shell
     helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
     helm repo update
     ```

  2. Install the Langflow app with the default options in the langflow namespace.

     If you have created a custom image with packaged flows, you can deploy Langflow by overriding the default values.yaml file with the --set flag.

     • Use a custom image with bundled flows:

       ```shell
       helm install my-langflow-app langflow/langflow-runtime -n langflow --create-namespace --set image.repository=myuser/langflow-hello-world --set image.tag=1.0.0
       ```

     • Alternatively, install the chart and download the flows from a URL with the --set flag:

       ```shell
       helm install my-langflow-app-with-flow langflow/langflow-runtime \
         -n langflow \
         --create-namespace \
         --set 'downloadFlows.flows[0].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json'
       ```

important

You may need to escape the square brackets in this command if you are using a shell that requires it:

```shell
helm install my-langflow-app-with-flow langflow/langflow-runtime \
  -n langflow \
  --create-namespace \
  --set 'downloadFlows.flows\[0\].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json'
```
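If you prefer a values file over --set flags (which also avoids the bracket-escaping issue entirely), the same flow URL can likely be expressed in values.yaml. This structure is inferred from the --set path above, so confirm it against the chart's default values.yaml:

```yaml
# Inferred from the --set path downloadFlows.flows[0].url;
# verify against the langflow-runtime chart's default values.
downloadFlows:
  flows:
    - url: https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json
```

Then install with helm install my-langflow-app-with-flow langflow/langflow-runtime -n langflow --create-namespace -f values.yaml.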

  3. Check the status of the pods.

     ```shell
     kubectl get pods -n langflow
     ```

Access the Langflow app API

  1. Get your service name.

     ```shell
     kubectl get svc -n langflow
     ```

     The service name is your release name followed by -langflow-runtime. For example, if you used helm install my-langflow-app-with-flow, the service name is my-langflow-app-with-flow-langflow-runtime.

  2. Enable port forwarding to access Langflow from your local machine:

     ```shell
     kubectl port-forward -n langflow svc/my-langflow-app-with-flow-langflow-runtime 7860:7860
     ```

  3. Confirm you can access the API at http://localhost:7860/api/v1/flows/ and view a list of flows.

     ```shell
     curl -v http://localhost:7860/api/v1/flows/
     ```

  4. Execute the packaged flow.

     The following command gets the first flow ID from the flows list and runs the flow.

     ```shell
     # Get flow ID
     id=$(curl -s "http://localhost:7860/api/v1/flows/" | jq -r '.[0].id')

     # Run flow
     curl -X POST \
       "http://localhost:7860/api/v1/run/$id?stream=false" \
       -H 'Content-Type: application/json' \
       -d '{
         "input_value": "Hello!",
         "output_type": "chat",
         "input_type": "chat"
       }'
     ```
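For application code, the two curl calls above can be reproduced in Python. This is a minimal sketch using only the standard library; it assumes the port-forward above is active on localhost:7860 and that the first flow in the list is the one you want to run:

```python
import json
from urllib import request

BASE_URL = "http://localhost:7860"  # assumes the kubectl port-forward above is active


def build_run_request(base_url: str, flow_id: str, message: str, stream: bool = False):
    """Build the URL and JSON body for Langflow's run endpoint,
    mirroring the curl command above."""
    url = f"{base_url}/api/v1/run/{flow_id}?stream={str(stream).lower()}"
    body = json.dumps(
        {"input_value": message, "output_type": "chat", "input_type": "chat"}
    ).encode()
    return url, body


def run_first_flow(message: str):
    """Fetch the first flow ID, then run it (requires a live runtime)."""
    with request.urlopen(f"{BASE_URL}/api/v1/flows/") as resp:
        flow_id = json.load(resp)[0]["id"]
    url, body = build_run_request(BASE_URL, flow_id, message)
    req = request.Request(url, data=body, headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)
```

Calling run_first_flow("Hello!") performs the same GET-then-POST sequence as the shell snippet above.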

Configure secrets

To inject secrets and Langflow global variables, use the secrets and env sections in the values.yaml file.

For example, the example flow JSON uses a global variable that is a secret. When you export the flow as JSON, it's recommended not to include the secret.

Instead, when importing the flow in the Langflow runtime, you can set the global variable in one of the following ways:


```yaml
env:
  - name: openai_key_var
    valueFrom:
      secretKeyRef:
        name: openai-key
        key: openai-key
```
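The secretKeyRef above expects a Secret named openai-key with a key also named openai-key in the release namespace. A sketch of a matching manifest, with the placeholder value yours to fill in:

```yaml
# Hypothetical Secret matching the secretKeyRef above.
apiVersion: v1
kind: Secret
metadata:
  name: openai-key
  namespace: langflow
type: Opaque
stringData:
  openai-key: "<your-openai-api-key>"
```

Equivalently, you can create it imperatively with kubectl create secret generic openai-key --from-literal=openai-key=<your-openai-api-key> -n langflow.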

Or directly in the values file (not recommended for secret values):


```yaml
env:
  - name: openai_key_var
    value: "sk-...."
```

Configure the log level

Set the log level and other Langflow configurations in the values.yaml file.


```yaml
env:
  - name: LANGFLOW_LOG_LEVEL
    value: "INFO"
```

Configure scaling

To scale the number of replicas for the Langflow application, change the replicaCount value in the values.yaml file.


```yaml
replicaCount: 3
```

To scale the application vertically by increasing the resources for the pods, change the resources values in the values.yaml file.


```yaml
resources:
  requests:
    memory: "2Gi"
    cpu: "1000m"
```

Deploy Langflow on AWS EKS, Google GKE, Azure AKS, and other platforms

For more information, see the Langflow Helm Charts repository.
