Arize
Arize is a tool built on OpenTelemetry and OpenInference for monitoring and optimizing LLM applications.
To enable Arize tracing, set the required Arize environment variables in your Langflow deployment. Arize begins monitoring and collecting telemetry data from your LLM applications automatically.
Instructions for integrating Langflow and Arize are also available in the Arize documentation.
Prerequisites
- If you are using the standard Arize platform, you need an Arize Space ID and Arize API Key.
- If you are using the open-source Arize Phoenix platform, you need an Arize Phoenix API key.
Connect Arize to Langflow
Arize Platform

- In your Arize dashboard, copy your Space ID and API Key (Ingestion Service Account Key).
- In the root of your Langflow application, edit your existing Langflow .env file or create a new one.
- Add ARIZE_SPACE_ID and ARIZE_API_KEY environment variables:

  ```
  ARIZE_SPACE_ID=SPACE_ID
  ARIZE_API_KEY=API_KEY
  ```

  Replace SPACE_ID and API_KEY with the values you copied from the Arize platform. You do not need to specify the Arize project name if you're using the standard Arize platform.
- Start your Langflow application with your .env file:

  ```
  uv run langflow run --env-file .env
  ```
Arize Phoenix

- In your Arize Phoenix dashboard, copy your API Key.
- In the root of your Langflow application, edit your existing Langflow .env file or create a new one.
- Add a PHOENIX_API_KEY environment variable:

  ```
  PHOENIX_API_KEY=API_KEY
  ```

  Replace API_KEY with the Arize Phoenix API key that you copied from the Arize Phoenix platform.
- Start your Langflow application with your .env file:

  ```
  uv run langflow run --env-file .env
  ```
Run a flow and view metrics in Arize
- In Langflow, run a flow that has an Agent or Language Model component. You must chat with the flow or trigger the LLM to produce traffic for Arize to trace. For example, you can create a flow with the Simple Agent template, add your OpenAI API key to the Agent component, and then click Playground to chat with the flow and generate traffic.
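Besides chatting in the Playground, you can generate traffic programmatically. The sketch below builds a request against Langflow's run endpoint; the /api/v1/run/{flow_id} path and payload shape are assumptions based on Langflow's REST API, and the URL and flow ID are placeholders you would replace with your own:

```python
import json
import urllib.request

LANGFLOW_URL = "http://localhost:7860"  # default local Langflow address (placeholder)
FLOW_ID = "your-flow-id"                # placeholder: copy from your flow's API pane

def build_run_request(base_url, flow_id, message):
    """Build a POST request that sends one chat message to a Langflow flow."""
    payload = {"input_value": message, "input_type": "chat", "output_type": "chat"}
    return urllib.request.Request(
        f"{base_url}/api/v1/run/{flow_id}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending the request requires a running Langflow instance:
# with urllib.request.urlopen(build_run_request(LANGFLOW_URL, FLOW_ID, "Hello")) as resp:
#     print(resp.status)
```

Each request like this produces a flow execution for Arize to trace.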
- In Arize, open your project dashboard, and then wait for Arize to process the data. This can take a few minutes.
- To view metrics for your flows, go to the LLM Tracing tab. Each Langflow execution generates two traces in Arize:
  - The AgentExecutor trace is the Arize trace of LangChain's AgentExecutor.
  - The UUID trace is the trace of the Langflow components.
- To view traces, go to the Traces tab. A trace is the complete journey of a request, made up of multiple spans.
- To view spans, go to the Spans tab. A span is a single operation within a trace. For example, a span could be a single API call to OpenAI or a single function call to a custom tool. For information about tracing metrics in Arize, see the Arize LLM tracing documentation.
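The trace/span relationship described above can be sketched with plain data structures. This is only a conceptual illustration of how a trace is a tree of spans, not the OpenTelemetry or Arize API:

```python
from dataclasses import dataclass, field

@dataclass
class Span:
    """A single operation within a trace, e.g. one OpenAI API call."""
    name: str
    duration_ms: float
    children: list = field(default_factory=list)

@dataclass
class Trace:
    """The complete journey of a request: a tree of spans under one root."""
    root: Span

    def span_count(self):
        """Count every span in the trace by walking the tree."""
        def walk(span):
            return 1 + sum(walk(child) for child in span.children)
        return walk(self.root)

# A hypothetical request that called an LLM and then a custom tool:
trace = Trace(Span("agent_run", 1200.0, [
    Span("openai_chat_completion", 800.0),
    Span("custom_tool_call", 150.0),
]))
```

Arize's Traces and Spans tabs show these same two views: the whole tree per request, and each operation individually.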
- To add a span to a dataset, click Add to Dataset. All metrics on the LLM Tracing tab can be added to datasets.
- To view a dataset, click the Datasets tab, and then select your dataset.