Deploy the Langflow production environment on Kubernetes
The Langflow runtime Helm chart is tailored for deploying applications in a production environment. It is focused on stability, performance, isolation, and security to ensure that applications run reliably and efficiently.
For security reasons, the default Langflow runtime Helm chart sets `readOnlyRootFilesystem: true`. This setting prevents modifications to the container's root filesystem at runtime, which is a recommended security measure in production environments.
Disabling `readOnlyRootFilesystem` (setting it to `false`) degrades your deployment's security posture. Only disable this setting if you understand the security implications and you have implemented other security measures.
For more information, see the Kubernetes documentation.
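If you do need to relax this setting, for example while debugging a container image, a values override is one way to do it. The following is a minimal sketch only: it assumes `readOnlyRootFilesystem` is exposed as a top-level value in the chart's `values.yaml`, as described above, so verify the key in your chart version before applying it.

```yaml
# values-override.yaml -- minimal sketch; verify the key in your chart version.
# Relaxing readOnlyRootFilesystem weakens the security posture described above.
readOnlyRootFilesystem: false
```

You could then apply the override with `helm upgrade --install my-langflow-app langflow/langflow-runtime -n langflow -f values-override.yaml`, replacing the release name with your own.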
Prerequisites
- A Kubernetes server
- kubectl
- Helm
Install the Langflow runtime Helm chart
- Add the repository to Helm:

  ```bash
  helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
  helm repo update
  ```
- Install the Langflow app with the default options in the `langflow` namespace.

  If you have a custom image with packaged flows, you can deploy Langflow by overriding the default `values.yaml` with the `--set` flag:

  ```bash
  helm install my-langflow-app langflow/langflow-runtime -n langflow --create-namespace \
    --set image.repository=myuser/langflow-hello-world \
    --set image.tag=1.0.0
  ```

  Alternatively, install the chart and download flows from a URL with the `--set` flag:

  ```bash
  helm install my-langflow-app-with-flow langflow/langflow-runtime \
    -n langflow \
    --create-namespace \
    --set 'downloadFlows.flows[0].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json'
  ```

  If your shell requires escaping square brackets, modify the `--set` path as needed. For example, `--set 'downloadFlows.flows\[0\].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/tests/data/basic_example.json'`.
- Check the status of the pods:

  ```bash
  kubectl get pods -n langflow
  ```
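Optionally, you can wait for the release to become ready before continuing. This is a minimal sketch: the release name `my-langflow-app` and the `app.kubernetes.io/name=langflow-runtime` label selector are assumptions, so replace the release name with your own and confirm the labels on your pods with `kubectl get pods -n langflow --show-labels`.

```bash
# Check the Helm release status (replace the release name with your own).
helm status my-langflow-app -n langflow

# Wait for the pods to become Ready.
# The label selector is an assumption about the chart's default labels; verify it first.
kubectl wait pod -n langflow \
  -l app.kubernetes.io/name=langflow-runtime \
  --for=condition=Ready --timeout=120s
```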
Access the Langflow runtime
- Get your service name:

  ```bash
  kubectl get svc -n langflow
  ```

  The service name is your release name suffixed by `-langflow-runtime`. For example, if you used `helm install my-langflow-app-with-flow`, then the service name is `my-langflow-app-with-flow-langflow-runtime`.
- Enable port forwarding to access Langflow from your local machine:

  ```bash
  kubectl port-forward -n langflow svc/my-langflow-app-with-flow-langflow-runtime 7860:7860
  ```
- Confirm you can access the API by calling `http://localhost:7860/api/v1/flows/`:

  ```bash
  curl -v http://localhost:7860/api/v1/flows/
  ```

  A successful request returns a list of flows.
- Run a packaged flow. The following example gets the first flow ID from the flows list, and then runs the flow:

  ```bash
  # Get flow ID
  id=$(curl -s "http://localhost:7860/api/v1/flows/" | jq -r '.[0].id')

  # Run flow
  curl -X POST \
    "http://localhost:7860/api/v1/run/$id?stream=false" \
    -H 'Content-Type: application/json' \
    -d '{
      "input_value": "Hello!",
      "output_type": "chat",
      "input_type": "chat"
    }'
  ```
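If your deployment packages more than one flow, you can list every flow and choose the ID you want to run instead of taking the first entry. This is a minimal sketch: it assumes each object returned by `/api/v1/flows/` has an `id` field (as used in the step above) and a `name` field, which may vary between Langflow versions.

```bash
# List flow IDs and names so you can pick a specific flow to run.
# The "name" field is an assumption; inspect the raw response if it differs.
curl -s "http://localhost:7860/api/v1/flows/" | jq -r '.[] | "\(.id)\t\(.name)"'
```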
Configure secrets and environment variables
Use the `env` section of the Langflow runtime Helm chart's `values.yaml` file to define environment variables for your Langflow deployment.
This includes built-in Langflow environment variables, as well as global variables used by your flows.
Langflow can source global variables from your runtime environment, such as Kubernetes secrets referenced in `values.yaml`.
For example, the Langflow runtime Helm chart's example flow JSON uses a global variable that is a secret.
If you want to run this flow in your Langflow deployment on Kubernetes, you need to include the secret in your runtime configuration.
When you export flows as JSON files, it's recommended to omit secrets. Whether or not a secret is included depends on how you declare the secret in your flow and whether you use the Save with my API keys option. For more information, see Import and export flows.
Set secrets
Kubernetes secrets are the recommended way to store sensitive values and credentials.
Use `secretKeyRef` to reference a Kubernetes secret in `values.yaml`:

```yaml
env:
  - name: OPENAI_API_KEY
    valueFrom:
      secretKeyRef:
        name: openai-credentials
        key: openai-key
```
Create and set secrets with `kubectl` and `helm`
You can use `kubectl` and `helm` commands to create and set secrets:
- Create a secret:

  ```bash
  kubectl create secret generic openai-credentials \
    --namespace langflow \
    --from-literal=OPENAI_API_KEY=sk...
  ```
- Verify the secret exists:

  ```bash
  kubectl get secrets -n langflow openai-credentials
  ```

  The output confirms the secret exists but doesn't display its values; Kubernetes stores them base64-encoded rather than in plain text.
- Upgrade the Helm release to use the secret:

  ```bash
  helm upgrade my-langflow-app langflow/langflow-runtime -n langflow \
    --reuse-values \
    --set "extraEnv[0].name=OPENAI_API_KEY" \
    --set "extraEnv[0].valueFrom.secretKeyRef.name=openai-credentials" \
    --set "extraEnv[0].valueFrom.secretKeyRef.key=OPENAI_API_KEY"
  ```

  Replace `my-langflow-app` with your release name, and escape square brackets if required by your shell.
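To confirm the secret is wired through, you can decode the stored value and check that the variable is present in a running pod. This is a minimal sketch: the `<pod-name>` placeholder and the `OPENAI_API_KEY` data key assume the secret was created with the `--from-literal` flag shown above.

```bash
# Decode the stored secret value (base64-encoded, not encrypted by default).
kubectl get secret openai-credentials -n langflow \
  -o jsonpath='{.data.OPENAI_API_KEY}' | base64 --decode; echo

# Confirm the variable is injected into a pod.
# Replace <pod-name> with a name from `kubectl get pods -n langflow`.
kubectl exec -n langflow <pod-name> -- printenv OPENAI_API_KEY
```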
Set the log level and other configuration variables
For non-sensitive variables, such as `LANGFLOW_LOG_LEVEL`, you can set the value directly in `values.yaml`:

```yaml
env:
  - name: LANGFLOW_LOG_LEVEL
    value: "INFO"
```
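You can also apply the same value at upgrade time instead of editing `values.yaml`. This is a minimal sketch that assumes the release name `my-langflow-app` and the `env` list shown above; escape the square brackets if your shell requires it.

```bash
# Set a non-sensitive variable from the command line.
helm upgrade my-langflow-app langflow/langflow-runtime -n langflow \
  --reuse-values \
  --set "env[0].name=LANGFLOW_LOG_LEVEL" \
  --set "env[0].value=INFO"
```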
Configure scaling
Use `replicaCount` and `resources` in the Langflow runtime Helm chart's `values.yaml` file to configure scaling:
- **Horizontal scaling**: Use `replicaCount` to set the number of replicas for your Langflow deployment.

  ```yaml
  replicaCount: 3
  ```
- **Vertical scaling**: Use the `resources` section to adjust pod resources depending on your application's needs.

  ```yaml
  resources:
    requests:
      memory: "2Gi"
      cpu: "1000m"
  ```
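To roll out these scaling values without editing the chart, you can pass them at upgrade time. This is a minimal sketch: it assumes the release name `my-langflow-app` and sets only the values shown above.

```bash
# Apply the scaling values from the command line instead of values.yaml.
helm upgrade my-langflow-app langflow/langflow-runtime -n langflow \
  --reuse-values \
  --set replicaCount=3 \
  --set resources.requests.memory=2Gi \
  --set resources.requests.cpu=1000m

# Confirm the new replica count.
kubectl get pods -n langflow
```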