Openlayer
Openlayer is a testing and evaluation platform for LLM applications. It provides comprehensive observability, testing, and monitoring capabilities to help you ship high-quality AI systems with confidence.
You can configure Langflow to collect tracing data about your flow executions and automatically send the data to Openlayer for analysis, monitoring, and evaluation.
Prerequisites
- An Openlayer account
- A running Langflow server with a flow that you want to trace
- An Openlayer inference pipeline
If you need a flow to test the Openlayer integration, see the Langflow quickstart.
Set Openlayer credentials as environment variables
1. Get your Openlayer API key from your Openlayer account.

2. Create an inference pipeline in Openlayer and copy the pipeline ID.

3. Set your Openlayer credentials as environment variables in the same environment where you run Langflow. In the following examples, replace `YOUR_API_KEY` and `YOUR_PIPELINE_ID` with your actual Openlayer credentials.

These commands set the environment variables in a Linux or macOS terminal session:

```
export OPENLAYER_API_KEY="YOUR_API_KEY"
export OPENLAYER_INFERENCE_PIPELINE_ID="YOUR_PIPELINE_ID"
```

These commands set the environment variables in a Windows command prompt session:

```
set OPENLAYER_API_KEY=YOUR_API_KEY
set OPENLAYER_INFERENCE_PIPELINE_ID=YOUR_PIPELINE_ID
```
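Before starting Langflow, you can optionally confirm that both variables are visible in the current environment. The following is a minimal sketch; `missing_openlayer_vars` is a hypothetical helper, not part of Langflow or Openlayer:

```python
import os

def missing_openlayer_vars(env=os.environ):
    """Return the names of required Openlayer variables that are unset or empty."""
    required = ["OPENLAYER_API_KEY", "OPENLAYER_INFERENCE_PIPELINE_ID"]
    return [name for name in required if not env.get(name)]

if __name__ == "__main__":
    missing = missing_openlayer_vars()
    if missing:
        raise SystemExit(f"Missing Openlayer environment variables: {missing}")
```

A check like this fails fast with a clear message instead of silently running Langflow without tracing.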
Start Langflow and view traces in Openlayer
1. Start Langflow in the same environment where you set the Openlayer environment variables:

```
uv run langflow run
```

2. Run a flow in Langflow.
Langflow automatically collects and sends tracing data about the flow execution to Openlayer, including:
- Component inputs and outputs
- Execution timing and latency
- LLM calls and nested operations
- User and session context
3. View the collected data in your Openlayer dashboard.
Each flow execution appears as a trace with a hierarchical view of all components and their nested operations.
Advanced configuration
Flow-specific pipelines
You can configure different Openlayer inference pipelines for different flows using flow-specific environment variables:
```
export OPENLAYER_PIPELINE_MY_FLOW_NAME="pipeline-id-1"
export OPENLAYER_PIPELINE_ANOTHER_FLOW="pipeline-id-2"
```
The flow name is converted to uppercase and non-alphanumeric characters are replaced with underscores. For example, "My Flow-Name" becomes OPENLAYER_PIPELINE_MY_FLOW_NAME.
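The documented naming rule can be sketched in a few lines of Python. This is an illustration of the rule as stated above, not Langflow's actual implementation, and `pipeline_env_var` is a hypothetical helper name:

```python
import re

def pipeline_env_var(flow_name: str) -> str:
    """Derive the flow-specific variable name: uppercase the flow name
    and replace every non-alphanumeric character with an underscore."""
    normalized = re.sub(r"[^A-Za-z0-9]", "_", flow_name).upper()
    return "OPENLAYER_PIPELINE_" + normalized
```

For example, `pipeline_env_var("My Flow-Name")` yields `OPENLAYER_PIPELINE_MY_FLOW_NAME`, matching the example above.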
JSON mapping
Alternatively, you can use a JSON mapping to configure multiple flows at once:
This command sets the mapping in a Linux or macOS terminal session:

```
export OPENLAYER_LANGFLOW_MAPPING='{"Flow Name 1":"pipeline-id-1","Flow Name 2":"pipeline-id-2"}'
```

This command sets the mapping in a Windows command prompt session:

```
set OPENLAYER_LANGFLOW_MAPPING={"Flow Name 1":"pipeline-id-1","Flow Name 2":"pipeline-id-2"}
```
Configuration priority
Openlayer configuration is resolved in the following order (highest priority first):
1. Flow-specific environment variable: OPENLAYER_PIPELINE_<FLOW_NAME>
2. JSON mapping: OPENLAYER_LANGFLOW_MAPPING
3. Default environment variable: OPENLAYER_INFERENCE_PIPELINE_ID
This allows you to set a default pipeline for all flows and override it for specific flows as needed.
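The resolution order can be sketched as follows. This is illustrative only, assuming the behavior described above; the real resolution happens inside Langflow's Openlayer integration, and `resolve_pipeline_id` is a hypothetical helper:

```python
import json
import re

def resolve_pipeline_id(flow_name, env):
    """Resolve a pipeline ID for a flow, highest priority first."""
    # 1. Flow-specific variable, e.g. OPENLAYER_PIPELINE_MY_FLOW_NAME
    key = "OPENLAYER_PIPELINE_" + re.sub(r"[^A-Za-z0-9]", "_", flow_name).upper()
    if env.get(key):
        return env[key]
    # 2. JSON mapping, keyed by the original (unmodified) flow name
    mapping = env.get("OPENLAYER_LANGFLOW_MAPPING")
    if mapping:
        pipeline_id = json.loads(mapping).get(flow_name)
        if pipeline_id:
            return pipeline_id
    # 3. Default pipeline shared by all flows (may be None if unset)
    return env.get("OPENLAYER_INFERENCE_PIPELINE_ID")
```

With this order, a flow-specific variable always wins, the JSON mapping covers named flows, and the default pipeline catches everything else.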
Disable Openlayer tracing
To disable the Openlayer integration, remove the OPENLAYER_API_KEY environment variable, and then restart Langflow.
Features
The Openlayer integration automatically captures:
- Component hierarchy: All flow components with parent-child relationships
- LangChain callbacks: Nested LLM calls and tool executions appear within their parent components
- Timing metrics: Start time, end time, and latency for each component
- Inputs and outputs: Component inputs and outputs with automatic type conversion
- User context: User ID and session ID propagation for better analytics
- Error tracking: Errors and logs captured in component metadata