Memory management options
Langflow provides flexible memory management options for storage and retrieval of data relevant to your flows and your Langflow server. This includes essential Langflow database tables, file management, and caching, as well as chat memory.
Storage options and paths
Langflow supports both local memory and external memory options.
Langflow's default storage option is a SQLite database stored in your system's cache directory. The default storage path depends on your operating system and installation method:
- macOS Desktop: /Users/<username>/.langflow/data/database.db
- Windows Desktop: C:\Users\<name>\AppData\Roaming\com.Langflow\data\langflow.db
- OSS macOS/Windows/Linux/WSL (uv pip install): <path_to_venv>/lib/python3.12/site-packages/langflow/langflow.db (the Python version in the path may vary)
- OSS macOS/Windows/Linux/WSL (git clone): <path_to_clone>/src/backend/base/langflow/langflow.db
Alternatively, you can use an external PostgreSQL database for all of your Langflow storage. You can also selectively use external storage for chat memory, separate from other Langflow storage. For more information, see Configure external memory and Store chat memory.
Local Langflow database tables
The following tables are stored in langflow.db:
• User: Stores user account information including credentials, permissions, profiles, and user management settings. For more information, see API keys and authentication.
• Flow: Contains flow definitions, including nodes, edges, and components, stored as JSON or database records. For more information, see Build flows.
• Message: Stores chat messages and interactions that occur between components. For more information, see Message objects and Store chat memory.
• Transaction: Records execution history and results of flow runs. This information is used for logging.
• ApiKey: Manages Langflow API authentication keys. Component API keys are stored in the Variables table. For more information, see API keys and authentication.
• Project: Provides a structure for flow storage, including single-user projects and shared projects accessed by multiple users. For more information, see Manage flows in projects.
• Variables: Stores global encrypted values and credentials. For more information, see Global variables.
• VertexBuild: Tracks the build status of individual nodes within flows. For more information, see Test flows in the Playground.
For more information, see the database models in the source code.
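If you want to confirm what your local database contains, you can open it directly with Python's built-in sqlite3 module. The following is a minimal sketch; the database path is a placeholder that you must replace with the path for your installation method from the list above, and the exact table names can vary by Langflow version.

```python
# Minimal sketch: list the tables in a local Langflow SQLite database.
# Replace db_path with the path for your installation from the list above.
import sqlite3

db_path = "/Users/<username>/.langflow/data/database.db"  # placeholder path

conn = sqlite3.connect(db_path)
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
).fetchall()
conn.close()

for (name,) in tables:
    print(name)
```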
Configure external memory
To replace the default Langflow SQLite database with another database, modify the LANGFLOW_DATABASE_URL environment variable, and then start Langflow with your .env file:
LANGFLOW_DATABASE_URL=postgresql://user:password@localhost:5432/langflow
For an example, see Configure an external PostgreSQL database.
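Before you point Langflow at an external database, it can help to confirm that the connection URL actually works. Below is a minimal sketch that tests the URL with SQLAlchemy; it assumes SQLAlchemy and a PostgreSQL driver such as psycopg2 are installed, and the example URL is a placeholder for your own connection details.

```python
# Minimal sketch: verify that the URL you plan to use for LANGFLOW_DATABASE_URL
# is reachable before starting Langflow. Assumes SQLAlchemy and a PostgreSQL
# driver (for example, psycopg2) are installed.
import os

from sqlalchemy import create_engine, text

url = os.environ.get(
    "LANGFLOW_DATABASE_URL",
    "postgresql://user:password@localhost:5432/langflow",  # placeholder URL
)

engine = create_engine(url)
with engine.connect() as conn:
    conn.execute(text("SELECT 1"))  # raises an error if the database is unreachable
print("Database connection OK")
```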
Configure the external database connection
The following settings let you fine-tune the database connection pool and timeouts; a sketch showing how they map onto SQLAlchemy engine options follows the list:
LANGFLOW_DB_CONNECTION_SETTINGS='{"pool_size": 20, "max_overflow": 30, "pool_timeout": 30, "pool_pre_ping": true, "pool_recycle": 1800, "echo": false}'
LANGFLOW_DB_CONNECT_TIMEOUT=20
- pool_size: Maximum number of database connections to keep in the pool (default: 20)
- max_overflow: Maximum number of connections that can be created beyond pool_size (default: 30)
- pool_timeout: Number of seconds to wait before timing out when getting a connection from the pool (default: 30)
- pool_pre_ping: If true, the pool tests connections for liveness upon each checkout (default: true)
- pool_recycle: Number of seconds after which a connection is automatically recycled (default: 1800, or 30 minutes)
- echo: If true, SQL queries are logged for debugging purposes (default: false)
- LANGFLOW_DB_CONNECT_TIMEOUT: Maximum number of seconds to wait when establishing a new database connection (default: 20)
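These keys mirror standard SQLAlchemy connection pool options. The sketch below is illustrative only, not Langflow's internal code; it shows what each setting controls by passing equivalent values to a SQLAlchemy engine, and it assumes SQLAlchemy and a PostgreSQL driver are installed.

```python
# Illustrative sketch: the LANGFLOW_DB_CONNECTION_SETTINGS keys correspond to
# standard SQLAlchemy engine and pool arguments. This is not Langflow's
# internal implementation; it only demonstrates what each setting controls.
import json

from sqlalchemy import create_engine

settings = json.loads(
    '{"pool_size": 20, "max_overflow": 30, "pool_timeout": 30,'
    ' "pool_pre_ping": true, "pool_recycle": 1800, "echo": false}'
)

engine = create_engine(
    "postgresql://user:password@localhost:5432/langflow",  # placeholder URL
    pool_size=settings["pool_size"],          # connections kept open in the pool
    max_overflow=settings["max_overflow"],    # extra connections allowed beyond pool_size
    pool_timeout=settings["pool_timeout"],    # seconds to wait for a free connection
    pool_pre_ping=settings["pool_pre_ping"],  # test each connection on checkout
    pool_recycle=settings["pool_recycle"],    # recycle connections after this many seconds
    echo=settings["echo"],                    # log SQL statements when true
    connect_args={"connect_timeout": 20},     # driver-level equivalent of LANGFLOW_DB_CONNECT_TIMEOUT
)
```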
Configure cache memory
The default Langflow caching behavior is an asynchronous, in-memory cache, set by the following environment variables:
LANGFLOW_LANGCHAIN_CACHE=InMemoryCache
LANGFLOW_CACHE_TYPE=Async
You can configure alternative caching options, but anything other than the default asynchronous, in-memory cache isn't supported. The default behavior is suitable for most use cases.
For other options, see the LANGFLOW_CACHE_TYPE environment variable.
Store chat memory
Chat-based flows with a Language Model or Agent component have built-in chat memory that is enabled by default. This memory allows them to retrieve and reference messages from previous conversations associated with the same session ID.
Built-in chat memory stores memories in the Langflow messages table.
How does chat memory work?
Chat memory is a cache that preserves past conversations so the LLM or agent can retain and reference that context in future interactions. For example, if a user has already told the LLM their name, the LLM can retrieve that information from chat memory rather than asking the user to repeat it in future conversations or messages.
Chat memory is distinct from vector store memory because it is built specifically for storing and retrieving chat messages from databases.
Components that support chat memory (such as the Agent, Language Model, Message History, or third-party Chat Memory components) provide access to their respective databases as memory. Retrieval as memory is an important distinction for LLMs and agents because this storage and retrieval mechanism is specifically designed to recall context from past conversations. Unlike vector stores, which are designed for semantic search and retrieval of text chunks, chat memory is designed to store and retrieve chat messages in a way that is optimized for conversation history.
Session ID and chat memory
Chat memories are grouped by session ID (session_id).
The default session ID is the flow ID, which means that all chat messages for a flow are stored under the same session ID as one large chat session.
For better segregation of chat memory, especially in flows used by multiple users, consider using custom session IDs. For example, if you use user IDs as session IDs, then each user's chat history is stored separately, isolating the context of their chats from other users' chats.
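For example, the sketch below runs the same flow for two different users, passing each user's ID as the session ID so their chat histories are stored separately. It assumes a Langflow server at http://localhost:7860, the /api/v1/run/<flow_id> endpoint, and a Langflow API key sent in the x-api-key header; the flow ID and API key are placeholders, and you should confirm the request format against your server's API reference.

```python
# Minimal sketch: send messages with per-user session IDs so each user's
# chat memory is stored separately. URL, flow ID, and API key are placeholders.
import requests

LANGFLOW_URL = "http://localhost:7860"
FLOW_ID = "your-flow-id"
API_KEY = "your-langflow-api-key"


def run_flow(message: str, session_id: str) -> dict:
    """Run the flow, grouping chat memory under the given session ID."""
    response = requests.post(
        f"{LANGFLOW_URL}/api/v1/run/{FLOW_ID}",
        headers={"x-api-key": API_KEY, "Content-Type": "application/json"},
        json={
            "input_value": message,
            "input_type": "chat",
            "output_type": "chat",
            "session_id": session_id,  # custom session ID, here a user ID
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()


# Each user's messages are stored under their own session ID.
run_flow("My name is Alice.", session_id="user-alice")
run_flow("My name is Bob.", session_id="user-bob")
```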
Chat memory options
Where and how chat memory is stored depends on the components used in your flow:
- Language Model and Agent components: All messages are stored in Langflow storage. The Agent component provides some memory configuration options, such as Number of Chat History Messages. The Language Model and Agent components' built-in chat memory is sufficient for most use cases.
  If you prefer dedicated, external chat memory storage, or you need to retrieve memories outside the context of a chat, you can add Message History and Chat Memory components to your flow.
- Message History component: By default, this component stores and retrieves memories from Langflow storage, unless you attach a Chat Memory component. It provides a few more options for sorting and filtering memories, although most of these options are built into the Agent component as configurable or fixed parameters.
  You can use the Message History component with or without a Language Model or Agent component. For example, if you need to retrieve data from memories outside of chat, you can use the Message History component to fetch that data directly from your chat memory database without feeding it into a chat (see the sketch at the end of this section).
- Third-party Chat Memory components: Use one of these components only if you need to store or retrieve chat memories in a dedicated chat memory database. Typically, this is necessary only if you have specific storage needs that aren't met by Langflow storage, such as managing chat memory data by working with the database directly, or using a different database than the default Langflow storage.
For more information and examples, see Message History component and Agent memory.
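As an ad hoc way to look at memories outside of a chat (the scenario mentioned for the Message History component above), the sketch below reads stored messages for one session ID directly from the default SQLite database. The database path, table name, and column names are assumptions based on the tables listed earlier on this page; inspect your own schema before relying on them, and prefer the Message History component or the Langflow API for anything beyond quick inspection.

```python
# Minimal sketch: read stored chat messages for one session ID directly from
# the default SQLite database. The path, table name, and column names are
# assumptions; inspect your own database schema first.
import sqlite3

db_path = "/Users/<username>/.langflow/data/database.db"  # placeholder path
session_id = "user-alice"  # the session whose history you want to read

conn = sqlite3.connect(db_path)
conn.row_factory = sqlite3.Row

# Find the chat message table (commonly named 'message' or 'messages').
found = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' AND name LIKE 'message%'"
).fetchone()
if found is None:
    raise SystemExit("No message table found; check the database schema.")

rows = conn.execute(
    f"SELECT * FROM {found['name']} WHERE session_id = ?", (session_id,)
).fetchall()
conn.close()

for row in rows:
    print(dict(row))
```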