About bundles

Bundles contain custom components that support specific third-party integrations with Langflow. You add them to your flows and configure them in the same way as Langflow's core components.

To browse bundles, click Bundles in the visual editor.

Bundle maintenance and documentation

Many bundled components are developed by third-party contributors to the Langflow codebase.

Some providers contribute documentation with their bundles, whereas others document their bundles in their own documentation. Some bundles have no documentation.

To find documentation for a specific bundled component, browse the Langflow docs and your provider's documentation. If available, you can also find links to relevant documentation, such as API endpoints, through the component itself:

  1. Click the component to expose the component's header menu.
  2. Click More.
  3. Select Docs.

The Langflow documentation focuses on using bundles within flows, so it covers the Langflow-specific configuration steps for bundled components. For information about provider-specific features or APIs, see the provider's documentation.

Component parameters

Some parameters are hidden by default in the visual editor. You can modify all parameters through the Controls in the component's header menu.

Core components and bundles

tip

The Langflow documentation doesn't list all bundles or components in bundles. For the most accurate and up-to-date list of bundles and components for your version of Langflow, check Bundles in the visual editor.

If you can't find a component that you used in an earlier version of Langflow, it may have been removed or marked as a legacy component.

Langflow offers generic Core components in addition to third-party, provider-specific bundles.

If you are looking for a specific service or integration, you can Search components in the visual editor.

If all else fails, you can always create your own custom components.

Legacy bundles

Legacy components are no longer supported and can be removed in a future release. You can continue to use them in existing flows, but it is recommended that you replace them with supported components as soon as possible. Suggested replacements are included in the Legacy banner on components in your flows. They are also given in release notes and Langflow documentation whenever possible.

If you aren't sure how to replace a legacy component, Search for components by provider, service, or component name. The component may have been deprecated in favor of a completely new component, a similar component, or a new version of the same component in a different category.

If there is no obvious replacement, consider whether another component can be adapted to your use case. For example, many Core components provide generic functionality that can support multiple providers and use cases, such as the API Request component.

If neither of these options is viable, you could use the legacy component's code to create your own custom component, or start a discussion about the legacy component.

To discourage use of legacy components in new flows, these components are hidden by default. In the visual editor, you can click Component settings to toggle the Legacy filter.

The following bundles include only legacy components.

CrewAI bundle

Replace the following legacy CrewAI components with other agentic components, such as the Agent component.

CrewAI Agent

This component represents CrewAI agents, allowing for the creation of specialized AI agents with defined roles, goals, and capabilities within a crew. For more information, see the CrewAI agents documentation.

This component accepts the following parameters:

| Name | Display Name | Info |
|------|--------------|------|
| role | Role | Input parameter. The role of the agent. |
| goal | Goal | Input parameter. The objective of the agent. |
| backstory | Backstory | Input parameter. The backstory of the agent. |
| tools | Tools | Input parameter. The tools at the agent's disposal. |
| llm | Language Model | Input parameter. The language model that runs the agent. |
| memory | Memory | Input parameter. This determines whether the agent should have memory or not. |
| verbose | Verbose | Input parameter. This enables verbose output. |
| allow_delegation | Allow Delegation | Input parameter. This determines whether the agent is allowed to delegate tasks to other agents. |
| allow_code_execution | Allow Code Execution | Input parameter. This determines whether the agent is allowed to execute code. |
| kwargs | kwargs | Input parameter. Additional keyword arguments for the agent. |
| output | Agent | Output parameter. The constructed CrewAI Agent object. |
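As an illustration only (the exact wiring is internal to Langflow), the input parameters above map naturally onto keyword arguments for a CrewAI-style `Agent` constructor. The following is a minimal plain-Python sketch with a hypothetical `build_agent_kwargs` helper, not Langflow's actual code:

```python
def build_agent_kwargs(role, goal, backstory, tools=None, llm=None,
                       memory=True, verbose=False, allow_delegation=False,
                       allow_code_execution=False, **extra):
    """Assemble keyword arguments for a CrewAI-style Agent constructor.

    Mirrors the input parameters of the legacy CrewAI Agent component;
    any extra kwargs are passed through unchanged.
    """
    agent_kwargs = {
        "role": role,
        "goal": goal,
        "backstory": backstory,
        "tools": tools or [],
        "memory": memory,
        "verbose": verbose,
        "allow_delegation": allow_delegation,
        "allow_code_execution": allow_code_execution,
        **extra,
    }
    # Only pass a language model if one was provided.
    if llm is not None:
        agent_kwargs["llm"] = llm
    return agent_kwargs

args = build_agent_kwargs(
    role="Researcher",
    goal="Summarize recent findings",
    backstory="A meticulous analyst.",
)
```

The resulting dictionary could then be expanded into a constructor call such as `Agent(**args)` in the CrewAI SDK.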
CrewAI Hierarchical Crew, CrewAI Hierarchical Task

The CrewAI Hierarchical Crew component represents a group of agents organized in a hierarchical structure, managing how they collaborate and the tasks they perform. It allows you to create a crew with a manager that oversees task execution. For more information, see the CrewAI hierarchical crew documentation.

It accepts the following parameters:

| Name | Display Name | Info |
|------|--------------|------|
| agents | Agents | Input parameter. The list of Agent objects representing the crew members. |
| tasks | Tasks | Input parameter. The list of HierarchicalTask objects representing the tasks to be executed. |
| manager_llm | Manager LLM | Input parameter. The language model for the manager agent. |
| manager_agent | Manager Agent | Input parameter. The specific agent to act as the manager. |
| verbose | Verbose | Input parameter. This enables verbose output for detailed logging. |
| memory | Memory | Input parameter. The memory configuration for the crew. |
| use_cache | Use Cache | Input parameter. This enables caching of results. |
| max_rpm | Max RPM | Input parameter. This sets the maximum requests per minute. |
| share_crew | Share Crew | Input parameter. This determines if the crew information is shared among agents. |
| function_calling_llm | Function Calling LLM | Input parameter. The language model for function calling. |
| crew | Crew | Output parameter. The constructed Crew object with hierarchical task execution. |
CrewAI Sequential Crew, CrewAI Sequential Task

The CrewAI Sequential Crew component represents a group of agents with tasks that are executed sequentially. This component allows for the creation of a crew that performs tasks in a specific order. For more information, see the CrewAI sequential crew documentation.

It accepts the following parameters:

| Name | Display Name | Info |
|------|--------------|------|
| tasks | Tasks | Input parameter. The list of SequentialTask objects representing the tasks to be executed. |
| verbose | Verbose | Input parameter. This enables verbose output for detailed logging. |
| memory | Memory | Input parameter. The memory configuration for the crew. |
| use_cache | Use Cache | Input parameter. This enables caching of results. |
| max_rpm | Max RPM | Input parameter. This sets the maximum requests per minute. |
| share_crew | Share Crew | Input parameter. This determines if the crew information is shared among agents. |
| function_calling_llm | Function Calling LLM | Input parameter. The language model for function calling. |
| crew | Crew | Output parameter. The constructed Crew object with sequential task execution. |
CrewAI Sequential Task Agent

This component creates a CrewAI Task and its associated agent, allowing you to define sequential tasks with specific agent roles and capabilities. For more information, see the CrewAI sequential agents documentation.

It accepts the following parameters:

| Name | Display Name | Info |
|------|--------------|------|
| role | Role | Input parameter. The role of the agent. |
| goal | Goal | Input parameter. The objective of the agent. |
| backstory | Backstory | Input parameter. The backstory of the agent. |
| tools | Tools | Input parameter. The tools at the agent's disposal. |
| llm | Language Model | Input parameter. The language model that runs the agent. |
| memory | Memory | Input parameter. This determines whether the agent should have memory or not. |
| verbose | Verbose | Input parameter. This enables verbose output. |
| allow_delegation | Allow Delegation | Input parameter. This determines whether the agent is allowed to delegate tasks to other agents. |
| allow_code_execution | Allow Code Execution | Input parameter. This determines whether the agent is allowed to execute code. |
| agent_kwargs | Agent kwargs | Input parameter. The additional kwargs for the agent. |
| task_description | Task Description | Input parameter. The descriptive text detailing the task's purpose and execution. |
| expected_output | Expected Task Output | Input parameter. The clear definition of the expected task outcome. |
| async_execution | Async Execution | Input parameter. Boolean flag indicating asynchronous task execution. |
| previous_task | Previous Task | Input parameter. The previous task in the sequence for chaining. |
| task_output | Sequential Task | Output parameter. The list of SequentialTask objects representing the created tasks. |

Embeddings bundle

  • Embedding Similarity: Replaced by built-in similarity search functionality in vector store components.
  • Text Embedder: Replaced by the embedding model components.

Vector Stores bundle

This bundle contains only the legacy Local DB component. All other vector store components can be found within their respective provider-specific bundles, such as the DataStax bundle.

Local DB

Replace the Local DB component with the Chroma DB vector store component (in the Chroma bundle) or another vector store component.

The Local DB component reads and writes to a persistent, local Chroma DB instance intended for use with Langflow. It has separate modes for reads and writes, automatic collection management, and default persistence in your Langflow cache directory.

Set the Mode parameter to reflect the operation you want the component to perform, and then configure the other parameters accordingly. Some parameters are only available for one mode.

To create or write to your local Chroma vector store, use Ingest mode.

The following parameters are available in Ingest mode:

| Name | Type | Description |
|------|------|-------------|
| Name Your Collection (collection_name) | String | Input parameter. The name for your Chroma vector store collection. Default: langflow. Only available in Ingest mode. |
| Persist Directory (persist_directory) | String | Input parameter. The base directory where you want to create and persist the vector store. If you use the Local DB component in multiple flows or to create multiple collections, collections are stored at $PERSISTENT_DIRECTORY/vector_stores/$COLLECTION_NAME. If not specified, the default location is your Langflow configuration directory. For more information, see Memory management options. |
| Embedding (embedding) | Embeddings | Input parameter. The embedding function to use for the vector store. |
| Allow Duplicates (allow_duplicates) | Boolean | Input parameter. If true (default), writes don't check for existing duplicates in the collection, allowing you to store multiple copies of the same content. If false, writes skip documents that match records already in the collection; the deduplication check can search the entire collection or only the number of records specified in Limit. Only available in Ingest mode. |
| Ingest Data (ingest_data) | Data or DataFrame | Input parameter. The records to write to the collection. Records are embedded and indexed for semantic search. Only available in Ingest mode. |
| Limit (limit) | Integer | Input parameter. Limit the number of records to compare when Allow Duplicates is false. This can help improve performance when writing to large collections, but it can result in some duplicate records. Only available in Ingest mode. |
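The interaction between Allow Duplicates and Limit can be sketched in plain Python. This is an illustrative approximation of the described behavior, not Langflow's actual implementation; record comparison is reduced to exact text equality:

```python
def should_write(new_text, collection, allow_duplicates=True, limit=None):
    """Decide whether a record should be written to the collection.

    With allow_duplicates=True, every write is accepted. Otherwise the
    write is rejected if new_text matches an existing record; the check
    scans the whole collection, or only the most recent `limit` records
    when a limit is set (faster, but duplicates can slip through).
    """
    if allow_duplicates:
        return True
    candidates = collection if limit is None else collection[-limit:]
    return new_text not in candidates

collection = ["alpha", "beta", "gamma"]
```

With `limit=2` here, a new `"alpha"` record would be accepted even though it duplicates an older entry, because only the last two records are compared. This is the performance trade-off the Limit parameter describes.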

Zep bundle

Zep Chat Memory

The Zep Chat Memory component is a legacy component. Replace this component with the Message History component.

This component creates a ZepChatMessageHistory instance, enabling storage and retrieval of chat messages using Zep, a memory server for LLMs.

It accepts the following parameters:

| Name | Type | Description |
|------|------|-------------|
| url | MessageText | Input parameter. The URL of the Zep instance. Required. |
| api_key | SecretString | Input parameter. The API Key for authentication with the Zep instance. |
| api_base_path | Dropdown | Input parameter. The API version to use. Options include api/v1 or api/v2. |
| session_id | MessageText | Input parameter. The unique identifier for the chat session. Optional. |
| message_history | BaseChatMessageHistory | Output parameter. An instance of ZepChatMessageHistory for the session. |
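The url and api_base_path parameters together determine the API base that the underlying Zep client targets. As a rough standard-library sketch of how such a base URL can be composed (the hostname is a placeholder, and the real client may join these values differently):

```python
from urllib.parse import urljoin

def zep_api_base(url, api_base_path="api/v2"):
    """Join the Zep instance URL with the selected API version path."""
    # Ensure a trailing slash so urljoin appends rather than replaces.
    if not url.endswith("/"):
        url += "/"
    return urljoin(url, api_base_path)

base = zep_api_base("https://zep.example.com", "api/v2")
```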

See also

Search