Langfuse

Langfuse is an open-source platform for LLM observability. It provides tracing and monitoring capabilities for AI applications, helping developers debug, analyze, and optimize their AI systems. Langfuse integrates with various tools and frameworks, including workflow builders and runtimes like Langflow.

This guide explains how to configure Langflow to collect tracing data about your flow executions and automatically send the data to Langfuse.

Prerequisites

tip

If you need a flow to test the Langfuse integration, see the Langflow quickstart.

Set Langfuse credentials as environment variables

  1. Create a set of Langfuse API keys.

  2. Copy the following API key information:

    • Secret Key
    • Public Key
    • Host URL
  3. Set your Langfuse project credentials as environment variables in the same environment where you run Langflow.

    In the following examples, replace SECRET_KEY, PUBLIC_KEY, and HOST_URL with your API key details from Langfuse.

    These commands set the environment variables in a Linux or macOS terminal session:


    export LANGFUSE_SECRET_KEY=SECRET_KEY
    export LANGFUSE_PUBLIC_KEY=PUBLIC_KEY
    export LANGFUSE_HOST=HOST_URL
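
    Optionally, confirm that the variables are visible in your current shell before you start Langflow:


    env | grep LANGFUSE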

Start Langflow and view traces in Langfuse

  1. Start Langflow in the same environment where you set the Langfuse environment variables:


    uv run langflow run

  2. Run a flow, either in the Langflow UI or through the Langflow API (see the example request after this list).

    Langflow automatically collects and sends tracing data about the flow execution to Langfuse.

  3. View the collected data in your Langfuse dashboard.

    Langfuse also provides a public live trace example dashboard.
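
To run the flow from step 2 through the Langflow API instead of the UI, a request along the lines of the following sketch should work. The flow ID, Langflow API key, and input payload shown here are placeholders; adapt them to your flow and authentication settings.


    # Hypothetical values: replace FLOW_ID and LANGFLOW_API_KEY with your own.
    curl -X POST "http://localhost:7860/api/v1/run/$FLOW_ID" \
      -H "Content-Type: application/json" \
      -H "x-api-key: $LANGFLOW_API_KEY" \
      -d '{"input_value": "Hello, world!"}'

Runs triggered through the API are traced in Langfuse the same way as runs started from the UI.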

Disable Langfuse tracing

To disable the Langfuse integration, remove the Langfuse environment variables, and then restart Langflow.
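
For example, if you exported the variables in a terminal session as shown above, unset them before restarting Langflow:


    unset LANGFUSE_SECRET_KEY
    unset LANGFUSE_PUBLIC_KEY
    unset LANGFUSE_HOST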

Run Langfuse and Langflow with Docker Compose

As an alternative to the previous setup, particularly for self-hosted Langfuse, you can run both services with Docker Compose.

  1. Create a set of Langfuse API keys.

  2. Copy the following API key information:

    • Secret Key
    • Public Key
    • Host URL
  3. Add your Langfuse credentials to the environment section of your Langflow docker-compose.yml file.

    The following example is based on Langflow's example docker-compose.yml file.


    services:
      langflow:
        image: langflowai/langflow:latest # or another version tag on https://hub.docker.com/r/langflowai/langflow
        pull_policy: always # set to 'always' when using 'latest' image
        ports:
          - "7860:7860"
        depends_on:
          - postgres
        environment:
          - LANGFLOW_DATABASE_URL=postgresql://langflow:langflow@postgres:5432/langflow
          # This variable defines where the logs, file storage, monitor data and secret keys are stored.
          - LANGFLOW_CONFIG_DIR=app/langflow
          - LANGFUSE_SECRET_KEY=sk-...
          - LANGFUSE_PUBLIC_KEY=pk-...
          - LANGFUSE_HOST=https://us.cloud.langfuse.com
        volumes:
          - langflow-data:/app/langflow

      postgres:
        image: postgres:16
        environment:
          POSTGRES_USER: langflow
          POSTGRES_PASSWORD: langflow
          POSTGRES_DB: langflow
        ports:
          - "5432:5432"
        volumes:
          - langflow-postgres:/var/lib/postgresql/data

    volumes:
      langflow-postgres:
      langflow-data:

  4. Start the containers:


    docker compose up

  5. To confirm that the Langflow container can reach your Langfuse host, run the following command:


    docker compose exec langflow python -c "import requests, os; addr = os.environ.get('LANGFUSE_HOST'); print(addr); res = requests.get(addr, timeout=5); print(res.status_code)"

    If there is an error, make sure the LANGFUSE_HOST environment variable is set in the environment section of your docker-compose.yml file.

    Output similar to the following indicates success:


    https://us.cloud.langfuse.com
    200
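
    If the command prints None or raises an exception instead, the variable isn't visible inside the container. One way to inspect the Langfuse-related variables the container actually received is:


    docker compose exec langflow env | grep LANGFUSE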
