# Deploy Langflow on Kubernetes
This guide demonstrates deploying Langflow on a Kubernetes cluster. Two charts are available in the Langflow Helm Charts repository:

- Deploy the **Langflow IDE** for the complete Langflow development environment.
- Deploy the **Langflow runtime** to deploy a standalone Langflow application in a more secure and stable environment.
## Deploy the Langflow IDE
The Langflow IDE deployment is a complete environment for developers to create, test, and debug their flows. It includes both the API and the UI. The `langflow-ide` Helm chart is available in the Langflow Helm Charts repository.
### Prerequisites
- A Kubernetes cluster
- `kubectl`
- `Helm`
### Prepare a Kubernetes cluster
This example uses Minikube, but you can use any Kubernetes cluster.
1. Create a Kubernetes cluster on Minikube.

   ```shell
   minikube start
   ```

2. Set `kubectl` to use Minikube.

   ```shell
   kubectl config use-context minikube
   ```
### Install the Langflow IDE Helm chart
1. Add the repository to Helm and update it.

   ```shell
   helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
   helm repo update
   ```

2. Install Langflow with the default options in the `langflow` namespace.

   ```shell
   helm install langflow-ide langflow/langflow-ide -n langflow --create-namespace
   ```
3. Check the status of the pods.

   ```shell
   kubectl get pods -n langflow
   ```

   ```text
   NAME                                 READY   STATUS    RESTARTS   AGE
   langflow-0                           1/1     Running   0          33s
   langflow-frontend-5d9c558dbb-g7tc9   1/1     Running   0          38s
   ```
### Configure port forwarding to access Langflow
Enable local port forwarding to access Langflow from your local machine.
- To make the Langflow API accessible from your local machine at port 7860:

  ```shell
  kubectl port-forward -n langflow svc/langflow-service-backend 7860:7860
  ```

- To make the Langflow UI accessible from your local machine at port 8080:

  ```shell
  kubectl port-forward -n langflow svc/langflow-service 8080:8080
  ```
Now you can access:

- The Langflow API at `http://localhost:7860`
- The Langflow UI at `http://localhost:8080`
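To quickly verify that both port forwards are working, you can probe the two endpoints from a second terminal. This is a minimal sketch; the `/health` path is an assumption about the backend's health endpoint, so check your Langflow version's API documentation if it returns a 404:

```shell
# Backend: should return a small JSON status payload if the API is reachable
curl http://localhost:7860/health

# Frontend: -I fetches only the response headers; expect an HTTP 200 with HTML content
curl -I http://localhost:8080
```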
### Configure the Langflow version
Langflow is deployed with the `latest` version by default. To specify a different Langflow version, set the `langflow.backend.image.tag` and `langflow.frontend.image.tag` values in the `values.yaml` file.

```yaml
langflow:
  backend:
    image:
      tag: "1.0.0a59"
  frontend:
    image:
      tag: "1.0.0a59"
```
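If you prefer not to edit `values.yaml`, the same tags can be passed at install time with Helm's `--set` flag. This sketch reuses the release and namespace names from the install step above:

```shell
# Override the image tags inline instead of editing values.yaml
helm install langflow-ide langflow/langflow-ide -n langflow --create-namespace \
  --set langflow.backend.image.tag="1.0.0a59" \
  --set langflow.frontend.image.tag="1.0.0a59"
```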
### Configure external storage
By default, the chart deploys a SQLite database stored in a local persistent disk. If you want to use an external PostgreSQL database, you can configure it in two ways:
- Use the built-in PostgreSQL chart:

  ```yaml
  postgresql:
    enabled: true
    auth:
      username: "langflow"
      password: "langflow-postgres"
      database: "langflow-db"
  ```
- Use an external database:

  ```yaml
  postgresql:
    enabled: false

  langflow:
    backend:
      externalDatabase:
        enabled: true
        driver:
          value: "postgresql"
        port:
          value: "5432"
        user:
          value: "langflow"
        password:
          valueFrom:
            secretKeyRef:
              key: "password"
              name: "your-secret-name"
        database:
          value: "langflow-db"
      sqlite:
        enabled: false
  ```
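The `secretKeyRef` above expects a Kubernetes secret named `your-secret-name` with a `password` key to already exist in the namespace. One way to create it is with `kubectl create secret generic`; the password value below is a placeholder to substitute with your own:

```shell
# Create the secret the external database configuration references
kubectl create secret generic your-secret-name \
  --namespace langflow \
  --from-literal=password='<your-database-password>'
```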
### Configure scaling
Scale the number of replicas and resources for both frontend and backend services:
```yaml
langflow:
  backend:
    replicaCount: 1
    resources:
      requests:
        cpu: 0.5
        memory: 1Gi
      # limits:
      #   cpu: 0.5
      #   memory: 1Gi

  frontend:
    enabled: true
    replicaCount: 1
    resources:
      requests:
        cpu: 0.3
        memory: 512Mi
      # limits:
      #   cpu: 0.3
      #   memory: 512Mi
```
## Deploy the Langflow runtime
The runtime chart is tailored for deploying applications in a production environment. It is focused on stability, performance, isolation, and security to ensure that applications run reliably and efficiently.
The `langflow-runtime` Helm chart is available in the Langflow Helm Charts repository.
### Prerequisites
- A Kubernetes server
- `kubectl`
- `Helm`
### Install the Langflow runtime Helm chart
1. Add the repository to Helm.

   ```shell
   helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
   helm repo update
   ```
2. Install the Langflow app with the default options in the `langflow` namespace.

   If you have created a custom image with packaged flows, you can deploy Langflow by overriding the default `values.yaml` file with the `--set` flag.
   - Use a custom image with bundled flows:

     ```shell
     helm install my-langflow-app langflow/langflow-runtime -n langflow --create-namespace --set image.repository=myuser/langflow-hello-world --set image.tag=1.0.0
     ```
   - Alternatively, install the chart and download the flows from a URL with the `--set` flag:

     ```shell
     helm install my-langflow-app-with-flow langflow/langflow-runtime \
       -n langflow \
       --create-namespace \
       --set 'downloadFlows.flows[0].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json'
     ```

     You may need to escape the square brackets in this command if you are using a shell that requires it:

     ```shell
     helm install my-langflow-app-with-flow langflow/langflow-runtime \
       -n langflow \
       --create-namespace \
       --set 'downloadFlows.flows\[0\].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json'
     ```
3. Check the status of the pods.

   ```shell
   kubectl get pods -n langflow
   ```
### Access the Langflow app API
1. Get your service name.

   ```shell
   kubectl get svc -n langflow
   ```
   The service name is your release name followed by `-langflow-runtime`. For example, if you used `helm install my-langflow-app-with-flow`, the service name is `my-langflow-app-with-flow-langflow-runtime`.
2. Enable port forwarding to access Langflow from your local machine:

   ```shell
   kubectl port-forward -n langflow svc/my-langflow-app-with-flow-langflow-runtime 7860:7860
   ```
3. Confirm you can access the API at `http://localhost:7860/api/v1/flows/` and view a list of flows.

   ```shell
   curl -v http://localhost:7860/api/v1/flows/
   ```
4. Execute the packaged flow. The following command gets the first flow ID from the flows list and runs the flow.

   ```shell
   # Get flow ID
   id=$(curl -s "http://localhost:7860/api/v1/flows/" | jq -r '.[0].id')

   # Run flow
   curl -X POST \
     "http://localhost:7860/api/v1/run/$id?stream=false" \
     -H 'Content-Type: application/json' \
     -d '{
       "input_value": "Hello!",
       "output_type": "chat",
       "input_type": "chat"
     }'
   ```
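When scripting this call, it helps to check the HTTP status before parsing the body. A minimal sketch, assuming the port-forward from the previous step is still active and `$id` holds a flow ID from the flows list:

```shell
# Capture the HTTP status code separately from the response body
status=$(curl -s -o response.json -w '%{http_code}' -X POST \
  "http://localhost:7860/api/v1/run/$id?stream=false" \
  -H 'Content-Type: application/json' \
  -d '{"input_value": "Hello!", "output_type": "chat", "input_type": "chat"}')

if [ "$status" = "200" ]; then
  jq . response.json   # pretty-print the flow output
else
  echo "Flow run failed with HTTP status $status" >&2
fi
```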
### Configure secrets
To inject secrets and Langflow global variables, use the `secrets` and `env` sections in the `values.yaml` file.

For example, the example flow JSON uses a global variable that is a secret. When you export the flow as JSON, it's recommended not to include the secret. Instead, when importing the flow in the Langflow runtime, set the global variable in one of the following ways: in the `values.yaml` file, or with Helm commands.

To set the variable in `values.yaml`, reference a Kubernetes secret from the `env` section:
```yaml
env:
  - name: openai_key_var
    valueFrom:
      secretKeyRef:
        name: openai-key
        key: openai-key
```
Or directly in the values file (not recommended for secret values):
```yaml
env:
  - name: openai_key_var
    value: "sk-...."
```
To set the variable with Helm commands instead:

1. Create the secret:

   ```shell
   kubectl create secret generic openai-credentials \
     --namespace langflow \
     --from-literal=OPENAI_API_KEY=sk...
   ```
2. Verify the secret exists. Secret values are stored base64-encoded and are not shown in this output.

   ```shell
   kubectl get secrets -n langflow openai-credentials
   ```
3. Upgrade the Helm release to use the secret.

   ```shell
   helm upgrade my-langflow-app-image langflow/langflow-runtime -n langflow \
     --reuse-values \
     --set "extraEnv[0].name=OPENAI_API_KEY" \
     --set "extraEnv[0].valueFrom.secretKeyRef.name=openai-credentials" \
     --set "extraEnv[0].valueFrom.secretKeyRef.key=OPENAI_API_KEY"
   ```
### Configure the log level
Set the log level and other Langflow configurations in the values.yaml file.
```yaml
env:
  - name: LANGFLOW_LOG_LEVEL
    value: "INFO"
```
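After upgrading the release, you can confirm the variable reached the container by inspecting the pod's environment. The pod name below is a placeholder; substitute a name from `kubectl get pods -n langflow`:

```shell
# Print the container's environment and filter for the log-level variable
kubectl exec -n langflow <pod-name> -- env | grep LANGFLOW_LOG_LEVEL
```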
### Configure scaling
To scale the number of replicas for the Langflow application, change the `replicaCount` value in the `values.yaml` file.

```yaml
replicaCount: 3
```
To scale the application vertically by increasing the resources for the pods, change the `resources` values in the `values.yaml` file.

```yaml
resources:
  requests:
    memory: "2Gi"
    cpu: "1000m"
```
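As with other values, scaling can be applied to a running release without editing `values.yaml` by passing `--set` to `helm upgrade`. A sketch, assuming the `my-langflow-app` release name from the earlier install example:

```shell
# Scale out to 3 replicas and raise the per-pod resource requests,
# keeping all other previously set values
helm upgrade my-langflow-app langflow/langflow-runtime -n langflow \
  --reuse-values \
  --set replicaCount=3 \
  --set resources.requests.memory="2Gi" \
  --set resources.requests.cpu="1000m"
```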
## Deploy Langflow on AWS EKS, Google GKE, or Azure AKS and other examples
For more information, see the Langflow Helm Charts repository.