Kubernetes
This guide will help you get LangFlow up and running in a Kubernetes cluster, including the following steps:
- Install LangFlow as an IDE in a Kubernetes cluster (for development)
- Install LangFlow as a standalone application in a Kubernetes cluster (for production runtime workloads)
LangFlow (IDE)
This solution is designed to provide a complete environment for developers to create, test, and debug their flows. It includes both the API and the UI.
Prerequisites
- Kubernetes server
- kubectl
- Helm
Step 0. Prepare a Kubernetes cluster
We use Minikube for this example, but you can use any Kubernetes cluster.
- Create a Kubernetes cluster on Minikube.

  ```shell
  minikube start
  ```

- Set `kubectl` to use Minikube.

  ```shell
  kubectl config use-context minikube
  ```
Step 1. Install the LangFlow Helm chart
- Add the repository to Helm.

  ```shell
  helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
  helm repo update
  ```

- Install LangFlow with the default options in the `langflow` namespace.

  ```shell
  helm install langflow-ide langflow/langflow-ide -n langflow --create-namespace
  ```

- Check the status of the pods.

  ```shell
  kubectl get pods -n langflow

  NAME                                 READY   STATUS    RESTARTS   AGE
  langflow-0                           1/1     Running   0          33s
  langflow-frontend-5d9c558dbb-g7tc9   1/1     Running   0          38s
  ```
Step 2. Access LangFlow
Enable local port forwarding to access LangFlow from your local machine.
```shell
kubectl port-forward -n langflow svc/langflow-langflow-runtime 7860:7860
```
Now you can access LangFlow at http://localhost:7860/.
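To confirm the port-forward is working before opening the browser, you can also hit the server from the command line. This is a minimal check; the `/health` path is an assumption, and any successful response on http://localhost:7860/ confirms the tunnel just as well.

```shell
# Quick sanity check of the forwarded port (the /health path is assumed;
# a 200 response on the root URL works just as well)
curl -s http://localhost:7860/health
```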
LangFlow version
To specify a different LangFlow version, you can set the `langflow.backend.image.tag` and `langflow.frontend.image.tag` values in the `values.yaml` file.
```yaml
langflow:
  backend:
    image:
      tag: "1.0.0a59"
  frontend:
    image:
      tag: "1.0.0a59"
```
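After editing `values.yaml`, apply the change to the running release with a Helm upgrade. This sketch assumes the release name `langflow-ide` and the `langflow` namespace used in Step 1.

```shell
# Roll out the new image tags to the existing release
helm upgrade langflow-ide langflow/langflow-ide -n langflow -f values.yaml
```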
Storage
By default, the chart uses a SQLite database stored on a local persistent disk.
If you want to use an external PostgreSQL database, you can set the `langflow.backend.externalDatabase` values in the `values.yaml` file.
```yaml
# Deploy postgresql. You can skip this section if you have an existing postgresql database.
postgresql:
  enabled: true
  fullnameOverride: "langflow-ide-postgresql-service"
  auth:
    username: "langflow"
    password: "langflow-postgres"
    database: "langflow-db"

langflow:
  backend:
    externalDatabase:
      enabled: true
      driver:
        value: "postgresql"
      host:
        value: "langflow-ide-postgresql-service"
      port:
        value: "5432"
      database:
        value: "langflow-db"
      user:
        value: "langflow"
      password:
        valueFrom:
          secretKeyRef:
            key: "password"
            name: "langflow-ide-postgresql-service"
    sqlite:
      enabled: false
```
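If you point LangFlow at an existing PostgreSQL instance instead of enabling the bundled `postgresql` subchart, the Secret referenced by `secretKeyRef` must already exist in the namespace. A minimal sketch, assuming the secret name and key shown in the values above and a placeholder password:

```shell
# Create the password secret read by langflow.backend.externalDatabase.password
kubectl create secret generic langflow-ide-postgresql-service \
  -n langflow \
  --from-literal=password='your-postgres-password'
```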
Scaling
You can scale the number of replicas for the LangFlow backend and frontend services by changing the `replicaCount` value in the `values.yaml` file.
```yaml
langflow:
  backend:
    replicaCount: 3
  frontend:
    replicaCount: 3
```
You can scale frontend and backend services independently.
To scale vertically (increase the resources for the pods), you can set the `resources` values in the `values.yaml` file.
```yaml
langflow:
  backend:
    resources:
      requests:
        memory: "2Gi"
        cpu: "1000m"
  frontend:
    resources:
      requests:
        memory: "1Gi"
        cpu: "1000m"
```
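Once the pods are running, you can compare actual consumption against these requests to tune them. The sketch below assumes the metrics-server add-on is available in your cluster (on Minikube, enable it with `minikube addons enable metrics-server`).

```shell
# Show current CPU and memory usage of the LangFlow pods
kubectl top pods -n langflow
```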
Deploy on AWS EKS, Google GKE, Azure AKS, and other examples
Visit the LangFlow Helm Charts repository for more information.
LangFlow (Runtime)
The runtime chart is tailored for deploying applications in a production environment. It is focused on stability, performance, isolation, and security to ensure that applications run reliably and efficiently.
In production environments, using a dedicated deployment for a set of flows is fundamental because it gives you granular control over resources.
Prerequisites
- Kubernetes server
- kubectl
- Helm
Step 0. Prepare a Kubernetes cluster
Follow the same steps as for the LangFlow IDE.
Step 1. Install the LangFlow runtime Helm chart
- Add the repository to Helm.

  ```shell
  helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
  helm repo update
  ```

- Install the LangFlow app with the default options in the `langflow` namespace.

  If you bundled the flow in a Docker image, you can specify the image name in the `values.yaml` file or with the `--set` flag:

  ```shell
  helm install my-langflow-app langflow/langflow-runtime -n langflow --create-namespace --set image.repository=myuser/langflow-just-chat --set image.tag=1.0.0
  ```

  If you want to download the flow from a remote location, you can specify the URL in the `values.yaml` file or with the `--set` flag:

  ```shell
  helm install my-langflow-app langflow/langflow-runtime -n langflow --create-namespace --set 'downloadFlows.flows[0].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/src/backend/base/langflow/initial_setup/starter_projects/Basic%20Prompting%20(Hello%2C%20world!).json'
  ```
- Check the status of the pods.

  ```shell
  kubectl get pods -n langflow
  ```
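If a pod stays in a non-Running state or the flow fails to load, check the container logs. The label selector below assumes the standard `app.kubernetes.io/instance` label that Helm charts commonly apply; adjust it if your chart version labels pods differently.

```shell
# Inspect startup logs of the runtime pods
kubectl logs -n langflow -l app.kubernetes.io/instance=my-langflow-app --tail=100
```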
Step 2. Access the LangFlow app API
Enable local port forwarding to access LangFlow from your local machine.
```shell
kubectl port-forward -n langflow svc/langflow-my-langflow-app 7860:7860
```
Now you can access the API at http://localhost:7860/api/v1/flows and execute the flow:
```shell
id=$(curl -s http://localhost:7860/api/v1/flows | jq -r '.flows[0].id')
curl -X POST \
    "http://localhost:7860/api/v1/run/$id?stream=false" \
    -H 'Content-Type: application/json' \
    -d '{
      "input_value": "Hello!",
      "output_type": "chat",
      "input_type": "chat"
    }'
```
Storage
In this case, storage is not needed as our deployment is stateless.
Log level and LangFlow configurations
You can set the log level and other LangFlow configurations in the `values.yaml` file.
```yaml
env:
  - name: LANGFLOW_LOG_LEVEL
    value: "INFO"
```
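The same variable can also be set on the command line instead of editing `values.yaml`. The sketch below uses Helm's array index syntax against the release installed earlier; `DEBUG` is just an example value.

```shell
# Switch the runtime to debug logging without editing values.yaml
helm upgrade my-langflow-app langflow/langflow-runtime -n langflow \
  --reuse-values \
  --set "env[0].name=LANGFLOW_LOG_LEVEL" \
  --set "env[0].value=DEBUG"
```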
Configure secrets and variables
To inject secrets and LangFlow global variables, you can use the `secrets` and `env` sections in the `values.yaml` file.
Say your flow uses a global variable that holds a secret; when you export the flow as JSON, it's recommended not to include its value.
When importing the flow into the LangFlow runtime, you can set the global variable using the `env` section in the `values.yaml` file.
Assuming you have a global variable called `openai_key_var`, you can read it directly from a secret:
```yaml
env:
  - name: openai_key_var
    valueFrom:
      secretKeyRef:
        name: openai-key
        key: openai-key
```
or directly from the values file (not recommended for secret values!):
```yaml
env:
  - name: openai_key_var
    value: "sk-...."
```
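If you go with the secret-based approach, the Kubernetes Secret referenced above has to exist before the pods start. A minimal sketch matching the `name` and `key` used in the example (replace the placeholder with your real key):

```shell
# Create the secret read by the openai_key_var environment variable
kubectl create secret generic openai-key \
  -n langflow \
  --from-literal=openai-key='sk-....'
```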
Scaling
You can scale the number of replicas for the LangFlow app by changing the `replicaCount` value in the `values.yaml` file.
```yaml
replicaCount: 3
```
To scale vertically (increase the resources for the pods), you can set the `resources` values in the `values.yaml` file.
```yaml
resources:
  requests:
    memory: "2Gi"
    cpu: "1000m"
```
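Both settings can also be applied to a running release without editing the values file; this sketch assumes the `my-langflow-app` release installed in Step 1.

```shell
# Scale out to 3 replicas while keeping all other values unchanged
helm upgrade my-langflow-app langflow/langflow-runtime -n langflow \
  --reuse-values --set replicaCount=3
```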
Other Examples
Visit the LangFlow Helm Charts repository for more examples and configurations. Use the default values file as a reference for all the available options.
Visit the examples directory to learn more about different deployment options.