Memory management options
Langflow provides flexible memory management options for storage and retrieval of data relevant to your flows and your Langflow server. This includes essential Langflow database tables, file management, and caching, as well as chat memory.
Storage options and paths
Langflow supports both local memory and external memory options.
Langflow's default storage option is a SQLite database stored in your system's cache directory.
The default storage path depends on your operating system and installation method:
- macOS Desktop: /Users/<username>/.langflow/data/database.db
- Windows Desktop: C:\Users\<name>\AppData\Roaming\com.Langflow\data\langflow.db
- OSS macOS/Windows/Linux/WSL (uv pip install): <path_to_venv>/lib/python3.12/site-packages/langflow/langflow.db (Python version may vary; see the sketch after this list to confirm the exact path)
- OSS macOS/Windows/Linux/WSL (git clone): <path_to_clone>/src/backend/base/langflow/langflow.db
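For the OSS install paths, the Python version segment varies with your environment. One way to confirm where the bundled SQLite file actually lives is to ask the installed package itself. This is a minimal sketch that assumes the langflow package is importable from your active virtual environment:

# Print the resolved path of the bundled SQLite database
python -c "import langflow, pathlib; print(pathlib.Path(langflow.__file__).parent / 'langflow.db')"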
Alternatively, you can use an external PostgreSQL database for all of your Langflow storage. You can also selectively use external storage for chat memory, separate from other Langflow storage. For more information, see Configure external memory and Store chat memory.
Local Langflow database tables
The following tables are stored in langflow.db:
• ApiKey: Manages Langflow API authentication keys. Component API keys are stored in the Variables table. For more information, see API keys and authentication.
• File: Stores metadata for files uploaded to Langflow's file management system, including file names, paths, sizes, and storage providers. For more information, see Manage files.
• Flow: Contains flow definitions, including nodes, edges, and components, stored as JSON or database records. For more information, see Build flows.
• Folder: Provides a structure for flow storage, including single-user folders and shared folders accessed by multiple users. For more information, see Manage flows in projects.
• Message: Stores chat messages and interactions that occur between components. For more information, see Message objects and Store chat memory.
• Transactions: Records execution history and results of flow runs. This information is used for logging.
• User: Stores user account information including credentials, permissions, profiles, and user management settings. For more information, see API keys and authentication.
• Variables: Stores global encrypted values and credentials. For more information, see Global variables.
• VertexBuild: Tracks the build status of individual nodes within flows. For more information, see Test flows in the Playground.
For more information, see the database models in the source code.
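Because the default database is a plain SQLite file, you can inspect these tables directly with the sqlite3 command-line tool. This is an illustrative sketch: replace <path_to>/langflow.db with your installation's actual path, and note that table names in the database may be lowercased (for example, message rather than Message), so use .tables to confirm before querying:

# List all tables in the Langflow database
sqlite3 <path_to>/langflow.db ".tables"
# Show the schema of the chat message table
sqlite3 <path_to>/langflow.db ".schema message"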
Configure external memory
To replace the default Langflow SQLite database with another database, modify the LANGFLOW_DATABASE_URL environment variable, and then start Langflow with your .env file:

LANGFLOW_DATABASE_URL=postgresql://user:password@localhost:5432/langflow
For an example, see Configure an external PostgreSQL database.
The LANGFLOW_DB_CONNECTION_SETTINGS environment variable accepts a JSON object that fine-tunes the database connection pool and timeout settings:

LANGFLOW_DB_CONNECTION_SETTINGS='{"pool_size": 20, "max_overflow": 30, "pool_timeout": 30, "pool_pre_ping": true, "pool_recycle": 1800, "echo": false}'
Connection pool parameters
- pool_size: Maximum number of database connections to keep in the pool. Default: 20 connections.
- max_overflow: Maximum number of connections that can be created beyond the pool_size. Default: 30 connections.
- pool_timeout: Number of seconds to wait before timing out on getting a connection from the pool. Default: 30 seconds.
- pool_pre_ping: If true, the pool tests connections for liveness upon each checkout. Default: true.
- pool_recycle: Number of seconds after which a connection is automatically recycled. Default: 1800 seconds (30 minutes).
- echo: If true, SQL queries are logged for debugging purposes. Default: false.
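These parameter names match SQLAlchemy's connection pool options, so the effective ceiling on simultaneous database connections is pool_size plus max_overflow (50 with the defaults above). Both variables can live in the same .env file; the values in this sketch are illustrative, not recommendations:

LANGFLOW_DATABASE_URL=postgresql://user:password@localhost:5432/langflow
LANGFLOW_DB_CONNECTION_SETTINGS='{"pool_size": 10, "max_overflow": 20, "pool_timeout": 30, "pool_pre_ping": true, "pool_recycle": 1800, "echo": false}'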
Configure cache memory
The default Langflow caching behavior is an asynchronous, in-memory cache:

LANGFLOW_LANGCHAIN_CACHE=InMemoryCache
LANGFLOW_CACHE_TYPE=async
Langflow officially supports only the default asynchronous, in-memory cache, which is suitable for most use cases. Other backends, including Redis and other external cache settings, are experimental, not officially supported, and may change without notice.
Cache environment variables
| Variable | Type | Default | Description |
| --- | --- | --- | --- |
| LANGFLOW_CACHE_TYPE | String | async | Set the cache type for Langflow's internal caching system. |
| LANGFLOW_LANGCHAIN_CACHE | String | InMemoryCache | Set the cache type for LangChain's caching system. |
| LANGFLOW_REDIS_HOST | String | localhost | Redis server hostname. |
| LANGFLOW_REDIS_PORT | Integer | 6379 | Redis server port. |
| LANGFLOW_REDIS_DB | Integer | 0 | Redis database number. |
| LANGFLOW_REDIS_CACHE_EXPIRE | Integer | 3600 | Cache expiration time in seconds. |
| LANGFLOW_REDIS_PASSWORD | String | Not set | Redis authentication password (optional). |
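As an illustration of how the Redis variables fit together, an experimental Redis-backed cache configuration might look like the following .env sketch. The hostname and password are placeholders, and the redis value for LANGFLOW_CACHE_TYPE is an assumption based on the Redis settings above; verify it against your Langflow version before relying on it:

# Experimental: not officially supported
LANGFLOW_CACHE_TYPE=redis
LANGFLOW_REDIS_HOST=redis.example.com
LANGFLOW_REDIS_PORT=6379
LANGFLOW_REDIS_DB=0
LANGFLOW_REDIS_CACHE_EXPIRE=3600
LANGFLOW_REDIS_PASSWORD=<your-redis-password>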
Store chat memory
Chat-based flows with a Chat Input or Chat Output component produce chat history that is stored in the Langflow messages table.
At minimum, this serves as a chat log, but it isn't functionally the same as chat memory that provides historical context to an LLM.
To store and retrieve chat memories in flows, you can use a Message History component or the Agent component's built-in chat memory.
How does chat memory work?
Chat memory is a cache for an LLM or agent to preserve past conversations to retain and reference that context in future interactions. For example, if a user has already told the LLM their name, the LLM can retrieve that information from chat memory rather than asking the user to repeat themselves in future conversations or messages.
Chat memory is distinct from vector store memory because it is built specifically for storing and retrieving chat messages from databases.
Components that support chat memory (such as the Agent and Message History components) provide access to their respective databases as memory. Retrieval as memory is an important distinction for LLMs and agents because this storage and retrieval mechanism is specifically designed to recall context from past conversations. Unlike vector stores, which are designed for semantic search and retrieval of text chunks, chat memory is designed to store and retrieve chat messages in a way that is optimized for conversation history.
Session ID and chat memory
Chat history and memories are grouped by session ID (session_id).
The default session ID is the flow ID, which means that all chat messages for a flow are stored under the same session ID as one large chat session.
For better segregation of chat memory, especially in flows used by multiple users, consider using custom session IDs. For example, if you use user IDs as session IDs, then each user's chat history is stored separately, isolating the context of their chats from other users' chats.
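For example, when you trigger a flow through the Langflow API, you can pass a custom session_id in the request payload so each user's conversation is stored under its own session. This is a sketch that assumes the standard /api/v1/run/$FLOW_ID endpoint with a Langflow API key in the x-api-key header; $LANGFLOW_SERVER_URL, $FLOW_ID, and $LANGFLOW_API_KEY are placeholders for your own values:

# Run a flow with a per-user session ID so chat memory stays isolated
curl -X POST "$LANGFLOW_SERVER_URL/api/v1/run/$FLOW_ID" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LANGFLOW_API_KEY" \
  -d '{
    "input_value": "Hello, my name is Ada.",
    "input_type": "chat",
    "output_type": "chat",
    "session_id": "user-123"
  }'

Requests that reuse the same session_id share one chat history; a different session_id starts a separate conversation.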
Chat memory options
Where and how chat memory is stored depends on the components used in your flow:
- Agent component: This component has built-in chat memory that is enabled by default. This memory allows the agent to retrieve and reference messages from previous conversations associated with the same session ID. All messages are stored in Langflow storage, and the component provides minimal memory configuration options, such as the number of messages to retrieve.
The Agent component's built-in chat memory is sufficient for most use cases. If you want to use external chat memory storage, retrieve memories outside the context of a chat, or use chat memory with a language model component (not an agent), you must use the Message History component (with or without a third-party chat memory component).
- Message History component: By default, this component stores and retrieves memories from Langflow storage, unless you attach a third-party chat memory component. It provides a few more options for sorting and filtering memories, although most of these options are also built into the Agent component as configurable or fixed parameters.
You can use the Message History component with or without a language model or agent. For example, if you need to retrieve data from memories outside of chat, you can use the Message History component to fetch that data directly from your chat memory database without feeding it into a chat.
- Third-party chat memory components: Use one of these components only if you need to store or retrieve chat memories from a dedicated external chat memory database. Typically, this is necessary only if you have specific storage needs that aren't met by Langflow storage. For example, you might want to manage chat memory data by working directly with the database, or use a different database than the default Langflow storage.
For more information and examples, see Message History component and Agent memory.