Integrate Arize with Langflow
Arize is a tool built on OpenTelemetry and OpenInference for monitoring and optimizing LLM applications.
To enable tracing, add the Arize environment variables to your Langflow application. Arize then begins monitoring and collecting telemetry data from your LLM applications automatically.
Prerequisites
- If you are using the standard Arize platform, you need an Arize Space ID and Arize API Key.
- If you are using the open-source Arize Phoenix platform, you need an Arize Phoenix API key.
Connect Arize to Langflow
The steps differ depending on whether you use the standard Arize platform or the open-source Arize Phoenix platform.

Arize Platform
- To retrieve your Arize Space ID and Arize API Key, navigate to the Arize dashboard.
- Click Settings, and then click Space Settings and Keys.
- Copy the SpaceID and API Key (Ingestion Service Account Key) values.
- Create a `.env` file in the root of your Langflow application.
- Add the `ARIZE_SPACE_ID` and `ARIZE_API_KEY` environment variables to your Langflow application. You do not need to specify the Arize Project name if you're using the standard Arize platform. Replace the following:
  - YOUR_ARIZE_SPACE_ID: the SpaceID value copied from Arize
  - YOUR_ARIZE_API_KEY: the API Key value copied from Arize

```
ARIZE_SPACE_ID=YOUR_ARIZE_SPACE_ID
ARIZE_API_KEY=YOUR_ARIZE_API_KEY
```
- Save the `.env` file.
- Start your Langflow application with the values from the `.env` file.

```
uv run langflow run --env-file .env
```
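Optionally, you can confirm that the `.env` file actually provides both variables before relying on them. The following is a minimal sketch, not part of the official setup; it assumes the `python-dotenv` package is installed (`pip install python-dotenv`).

```python
# Quick sanity check: verify the Arize credentials load from the .env file.
# Assumes python-dotenv is installed; run it from the same directory as .env.
import os

from dotenv import load_dotenv

load_dotenv(".env")

for var in ("ARIZE_SPACE_ID", "ARIZE_API_KEY"):
    if not os.getenv(var):
        raise SystemExit(f"{var} is not set; check your .env file")

print("Arize credentials found; start Langflow with --env-file .env")
```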
Arize Phoenix

- To retrieve your Arize Phoenix API key, navigate to the Arize dashboard.
- Click API Key.
- Copy the API Key value.
- Create a `.env` file in the root of your Langflow application.
- Add the `PHOENIX_API_KEY` environment variable to your Langflow application instead of the `ARIZE_SPACE_ID` and `ARIZE_API_KEY` variables. Replace YOUR_PHOENIX_API_KEY with the Arize Phoenix API key that you copied from the Arize Phoenix platform.

```
PHOENIX_API_KEY=YOUR_PHOENIX_API_KEY
```

- Save the `.env` file.
- Start your Langflow application with the values from the `.env` file.

```
uv run langflow run --env-file .env
```
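For the Phoenix path, you can optionally send a single manual test span before generating Langflow traffic, to confirm that your API key and endpoint are accepted. The sketch below is an assumption-heavy example rather than part of Langflow's integration: it uses the `arize-phoenix-otel` and `python-dotenv` packages, and it relies on `PHOENIX_COLLECTOR_ENDPOINT` (or the `endpoint` argument of `register`) pointing at your Phoenix instance, which the steps above do not require.

```python
# Rough connectivity check: emit one manual OpenTelemetry span to Phoenix.
# Assumes arize-phoenix-otel and python-dotenv are installed, PHOENIX_API_KEY
# is set in .env, and PHOENIX_COLLECTOR_ENDPOINT points at your Phoenix
# instance (otherwise register() falls back to a local Phoenix server).
import os

from dotenv import load_dotenv
from phoenix.otel import register

load_dotenv(".env")
if not os.getenv("PHOENIX_API_KEY"):
    raise SystemExit("PHOENIX_API_KEY is not set; check your .env file")

# register() configures an OpenTelemetry tracer provider that exports to Phoenix.
tracer_provider = register(project_name="langflow-connectivity-check")
tracer = tracer_provider.get_tracer(__name__)

with tracer.start_as_current_span("langflow-connectivity-test"):
    pass  # the test span should appear in the Phoenix UI shortly afterward
```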
For more information, see the Arize documentation.
Run a flow and view metrics in Arize
- In Langflow, select the Simple agent starter project.
- In the Agent component's OpenAI API Key field, paste your OpenAI API key.
- Click Playground. Ask your Agent some questions to generate traffic.
- Navigate to the Arize dashboard, and then open your project. You may have to wait a few minutes for Arize to process the data.
- The LLM Tracing tab shows metrics for your flow.
Each Langflow execution generates two traces in Arize. The `AgentExecutor` trace is the Arize trace of LangChain's `AgentExecutor`. The UUID trace is the trace of the Langflow components.

- To view traces, click the Traces tab. A trace is the complete journey of a request, made of multiple spans.
- To view spans, click the Spans tab. A span is a single operation within a trace. For example, a span could be a single API call to OpenAI or a single function call to a custom tool. You can also retrieve spans programmatically, as shown in the sketch after this list. For more on traces, spans, and other metrics in Arize, see the Arize documentation.
- All metrics in the LLM Tracing tab can be added to Datasets. To add a span to a Dataset, click the Add to Dataset button.
- To view a Dataset, click the Datasets tab, and then select your Dataset. For more on Datasets, see the Arize documentation.
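If you followed the Arize Phoenix path, you can also pull spans out of Phoenix programmatically instead of browsing the Spans tab. This is a rough sketch under assumptions not stated in this guide: it uses the `arize-phoenix` Python client and expects `PHOENIX_API_KEY` and `PHOENIX_COLLECTOR_ENDPOINT` to be set in your environment.

```python
# Rough sketch: pull recent Langflow spans from Arize Phoenix into a DataFrame.
# Assumes the arize-phoenix package is installed and that PHOENIX_API_KEY and
# PHOENIX_COLLECTOR_ENDPOINT are set (for example via the .env file above).
import phoenix as px

client = px.Client()  # reads the Phoenix endpoint and API key from the environment

spans = client.get_spans_dataframe()  # one row per span
print(f"Fetched {len(spans)} spans")

# Column names can vary between Phoenix versions, so only show the ones present.
cols = [c for c in ("name", "span_kind", "start_time", "end_time") if c in spans.columns]
print(spans[cols].head())
```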