
Kubernetes

This guide will help you get LangFlow up and running in a Kubernetes cluster. It covers two deployment options: the LangFlow IDE for development and the LangFlow runtime for production.

LangFlow (IDE)


This solution is designed to provide a complete environment for developers to create, test, and debug their flows. It includes both the API and the UI.

Prerequisites

  • Kubernetes server
  • kubectl
  • Helm

Step 0. Prepare a Kubernetes cluster

We use Minikube for this example, but you can use any Kubernetes cluster.

  1. Create a Kubernetes cluster on Minikube.


    minikube start

  2. Set kubectl to use Minikube.


    kubectl config use-context minikube
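
    You can confirm that kubectl now points at the Minikube cluster before continuing:

    kubectl get nodes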

Step 1. Install the LangFlow Helm chart

  1. Add the repository to Helm.


    helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
    helm repo update

  2. Install LangFlow with the default options in the langflow namespace.


    helm install langflow-ide langflow/langflow-ide -n langflow --create-namespace

  3. Check the status of the pods.


    kubectl get pods -n langflow


    NAME                                 READY   STATUS    RESTARTS   AGE
    langflow-0                           1/1     Running   0          33s
    langflow-frontend-5d9c558dbb-g7tc9   1/1     Running   0          38s

Step 2. Access LangFlow

Enable local port forwarding to access LangFlow from your local machine.


kubectl port-forward -n langflow svc/langflow-langflow-runtime 7860:7860

Now you can access LangFlow at http://localhost:7860/.
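
To quickly verify that the backend is reachable through the port forward, you can query the health endpoint (assuming the default LangFlow health route is enabled):

curl http://localhost:7860/health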

LangFlow version

To specify a different LangFlow version, you can set the langflow.backend.image.tag and langflow.frontend.image.tag values in the values.yaml file.


langflow:
  backend:
    image:
      tag: "1.0.0a59"
  frontend:
    image:
      tag: "1.0.0a59"
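
After editing values.yaml, apply the change with a standard Helm upgrade, for example:

helm upgrade langflow-ide langflow/langflow-ide -n langflow -f values.yaml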

Storage

By default, the chart uses a SQLite database stored on a local persistent disk. If you want to use an external PostgreSQL database, you can set the langflow.backend.externalDatabase values in the values.yaml file.


# Deploy postgresql. You can skip this section if you have an existing postgresql database.
postgresql:
  enabled: true
  fullnameOverride: "langflow-ide-postgresql-service"
  auth:
    username: "langflow"
    password: "langflow-postgres"
    database: "langflow-db"

langflow:
  backend:
    externalDatabase:
      enabled: true
      driver:
        value: "postgresql"
      host:
        value: "langflow-ide-postgresql-service"
      port:
        value: "5432"
      database:
        value: "langflow-db"
      user:
        value: "langflow"
      password:
        valueFrom:
          secretKeyRef:
            key: "password"
            name: "langflow-ide-postgresql-service"
    sqlite:
      enabled: false
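
After applying these values, you can check that the secret referenced by secretKeyRef exists (assuming the bundled postgresql subchart creates it under the fullnameOverride name, as the values above imply):

kubectl get secret langflow-ide-postgresql-service -n langflow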

Scaling

You can scale the number of replicas for the LangFlow backend and frontend services by changing the replicaCount value in the values.yaml file.


langflow:
  backend:
    replicaCount: 3
  frontend:
    replicaCount: 3

You can scale frontend and backend services independently.
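
The same change can also be made without editing values.yaml, for example with Helm's --set flag (add --reuse-values if you want to keep your other overrides):

helm upgrade langflow-ide langflow/langflow-ide -n langflow \
  --set langflow.backend.replicaCount=3 \
  --set langflow.frontend.replicaCount=3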

To scale vertically (increase the resources for the pods), you can set the resources values in the values.yaml file.


langflow:
  backend:
    resources:
      requests:
        memory: "2Gi"
        cpu: "1000m"
  frontend:
    resources:
      requests:
        memory: "1Gi"
        cpu: "1000m"
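
To see how much of these requests the pods actually consume, you can use kubectl top (this requires the metrics-server addon; on Minikube you can enable it with minikube addons enable metrics-server):

kubectl top pods -n langflow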

Deploy on AWS EKS, Google GKE, Azure AKS, and other examples

Visit the LangFlow Helm Charts repository for more information.

LangFlow (Runtime)


The runtime chart is tailored for deploying applications in a production environment. It is focused on stability, performance, isolation, and security to ensure that applications run reliably and efficiently.

In production environments, using a dedicated deployment for a set of flows is fundamental for granular control over resources.

Prerequisites

  • Kubernetes server
  • kubectl
  • Helm

Step 0. Prepare a Kubernetes cluster

Follow the same steps as for the LangFlow IDE.

Step 1. Install the LangFlow runtime Helm chart

  1. Add the repository to Helm.


    helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
    helm repo update

  2. Install the LangFlow app with the default options in the langflow namespace. If you bundled the flow in a Docker image, specify the image name in the values.yaml file or with the --set flag, as in the first command below. If you want to download the flow from a remote location, specify the URL in the values.yaml file or with the --set flag, as in the second command below.


    helm install my-langflow-app langflow/langflow-runtime -n langflow --create-namespace --set image.repository=myuser/langflow-just-chat --set image.tag=1.0.0


    helm install my-langflow-app langflow/langflow-runtime -n langflow --create-namespace --set downloadFlows.flows[0].url=https://raw.githubusercontent.com/langflow-ai/langflow/dev/src/backend/base/langflow/initial_setup/starter_projects/Basic%20Prompting%20(Hello%2C%20world!).json

  3. Check the status of the pods.


    kubectl get pods -n langflow
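
You can also tail the pod logs to confirm that the server started and the flow was loaded; the label selector below follows standard Helm chart labels and is an assumption, so adjust it to match your release if needed:

    kubectl logs -n langflow -l app.kubernetes.io/instance=my-langflow-app -f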

Step 2. Access the LangFlow app API

Enable local port forwarding to access LangFlow from your local machine.


kubectl port-forward -n langflow svc/langflow-my-langflow-app 7860:7860

Now you can access the API at http://localhost:7860/api/v1/flows and execute the flow:


id=$(curl -s http://localhost:7860/api/v1/flows | jq -r '.flows[0].id')
curl -X POST \
  "http://localhost:7860/api/v1/run/$id?stream=false" \
  -H 'Content-Type: application/json' \
  -d '{
    "input_value": "Hello!",
    "output_type": "chat",
    "input_type": "chat"
  }'

Storage

In this case, storage is not needed as our deployment is stateless.

Log level and LangFlow configurations

You can set the log level and other LangFlow configurations in the values.yaml file.


env:
  - name: LANGFLOW_LOG_LEVEL
    value: "INFO"

Configure secrets and variables

To inject secrets and LangFlow global variables, you can use the secrets and env sections in the values.yaml file.

Let's say your flow uses a global variable that is a secret; when you export the flow as JSON, it's recommended not to include it. When importing the flow into the LangFlow runtime, you can set the global variable in the env section of the values.yaml file. Assuming you have a global variable called openai_key_var, you can read it directly from a secret:


env:
  - name: openai_key_var
    valueFrom:
      secretKeyRef:
        name: openai-key
        key: openai-key

or directly from the values file (not recommended for secret values!):


env:
  - name: openai_key_var
    value: "sk-...."
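
If the openai-key secret referenced above doesn't exist yet, a minimal way to create it in the same namespace is the following (the secret name and key must match the secretKeyRef values):

kubectl create secret generic openai-key --from-literal=openai-key="sk-...." -n langflow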

Scaling

You can scale the number of replicas for the LangFlow app by changing the replicaCount value in the values.yaml file.


replicaCount: 3
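
The same can be done from the command line with the --set flag, for example:

helm upgrade my-langflow-app langflow/langflow-runtime -n langflow --set replicaCount=3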

To scale vertically (increase the resources for the pods), you can set the resources values in the values.yaml file.


resources:
  requests:
    memory: "2Gi"
    cpu: "1000m"

Other Examples


Visit the LangFlow Helm Charts repository for more examples and configurations. Use the default values file as a reference for all available options.

note

Visit the examples directory to learn more about different deployment options.
