Input and output components in Langflow
Input and output components define where data enters and exits your flow.
Both components accept user input and return a `Message` object, but they serve different purposes.
The Text Input component accepts a text string input and returns a `Message` object containing only the input text. The output does not appear in the Playground.
The Chat Input component accepts multiple input types, including text, files, and metadata, and returns a `Message` object containing the text along with sender information, the session ID, and file attachments. The Chat Input component provides an interactive chat interface in the Playground.
Chat Input
This component collects user input as text strings from the chat and wraps it in a `Message` object that includes the input text, sender information, session ID, file attachments, and styling properties.
It can optionally store the message in a chat history.
Inputs
| Name | Display Name | Info |
|---|---|---|
| `input_value` | Text | The `Message` to be passed as input. |
| `should_store_message` | Store Messages | Store the message in the history. |
| `sender` | Sender Type | The type of sender. |
| `sender_name` | Sender Name | The name of the sender. |
| `session_id` | Session ID | The session ID of the chat. If empty, the current session ID parameter is used. |
| `files` | Files | The files to be sent with the message. |
| `background_color` | Background Color | The background color of the icon. |
| `chat_icon` | Icon | The icon of the message. |
| `text_color` | Text Color | The text color of the name. |
Outputs
| Name | Display Name | Info |
|---|---|---|
| `message` | Message | The resulting chat message object with all specified properties. |
Message method
The `ChatInput` class provides an asynchronous method to create and store a `Message` object based on the input parameters. The `Message` object is created in the `message_response` method of the `ChatInput` class using the `Message.create()` factory method.
```python
message = await Message.create(
    text=self.input_value,
    sender=self.sender,
    sender_name=self.sender_name,
    session_id=self.session_id,
    files=self.files,
    properties={
        "background_color": background_color,
        "text_color": text_color,
        "icon": icon,
    },
)
```
Text Input
The Text Input component accepts a text string input and returns a `Message` object containing only the input text. The output does not appear in the Playground.
Inputs
| Name | Display Name | Info |
|---|---|---|
| `input_value` | Text | The text/content to be passed as output. |
Outputs
| Name | Display Name | Info |
|---|---|---|
| `text` | Text | The resulting text message. |
Chat Output
The Chat Output component creates a `Message` object that includes the input text, sender information, session ID, and styling properties. The component accepts the following inputs.
Inputs
| Name | Display Name | Info |
|---|---|---|
| `input_value` | Text | The message to be passed as output. |
| `should_store_message` | Store Messages | The flag to store the message in the history. |
| `sender` | Sender Type | The type of sender. |
| `sender_name` | Sender Name | The name of the sender. |
| `session_id` | Session ID | The session ID of the chat. If empty, the current session ID parameter is used. |
| `data_template` | Data Template | The template to convert Data to Text. If the option is left empty, it is dynamically set to the Data's `text` key. |
| `background_color` | Background Color | The background color of the icon. |
| `chat_icon` | Icon | The icon of the message. |
| `text_color` | Text Color | The text color of the name. |
| `clean_data` | Basic Clean Data | When enabled, DataFrame inputs are cleaned when converted to text. Cleaning removes empty rows, empty lines in cells, and multiple newlines. |
Outputs
| Name | Display Name | Info |
|---|---|---|
| `message` | Message | The resulting chat message object with all specified properties. |
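The Basic Clean Data behavior described above can be approximated with a short sketch. This is an illustrative implementation, not Langflow's actual code: it drops fully empty rows, removes blank lines inside cells, and collapses runs of newlines.

```python
import re


def basic_clean_cell(cell: str) -> str:
    """Remove blank lines inside one cell and collapse repeated newlines."""
    lines = [line for line in cell.splitlines() if line.strip()]
    return re.sub(r"\n{2,}", "\n", "\n".join(lines))


def basic_clean_rows(rows: list[dict]) -> list[dict]:
    """Drop rows whose cells are all empty, then clean each remaining cell."""
    cleaned = []
    for row in rows:
        if not any(str(value).strip() for value in row.values()):
            continue  # skip fully empty rows
        cleaned.append({key: basic_clean_cell(str(value)) for key, value in row.items()})
    return cleaned


rows = [
    {"name": "Docker", "notes": "Install via\n\n\nDocker Desktop"},
    {"name": "", "notes": "   "},  # fully empty row: removed by the cleaner
]
print(basic_clean_rows(rows))
```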
Text Output
The Text Output component takes a single text input and returns a `Message` object containing that text. The output does not appear in the Playground.
Inputs
| Name | Display Name | Info |
|---|---|---|
| `input_value` | Text | The text to be passed as output. |
Outputs
| Name | Display Name | Info |
|---|---|---|
| `text` | Text | The resulting text message. |
Chat components example flow
- To use the Chat Input and Chat Output components in a flow, connect them to components that accept or send the `Message` type. For this example, connect a Chat Input component to an OpenAI model component's Input port, and then connect the OpenAI model component's Message port to the Chat Output component.
- In the OpenAI model component, in the OpenAI API Key field, add your OpenAI API key.
The flow looks like this:
- To send a message to your flow, open the Playground, and then enter a message. The OpenAI model component responds. Optionally, in the OpenAI model component, enter a System Message to control the model's response.
- In the Langflow UI, click your flow name, and then click Logs. The Logs pane opens. Here, you can inspect your component logs.
- Your first message was sent by the Chat Input component to the OpenAI model component. Click Outputs to view the sent message:
```json
"messages": [
  {
    "message": "What's the recommended way to install Docker on Mac M1?",
    "sender": "User",
    "sender_name": "User",
    "session_id": "Session Apr 21, 17:37:04",
    "stream_url": null,
    "component_id": "ChatInput-4WKag",
    "files": [],
    "type": "text"
  }
],
```
- Your second message was sent by the OpenAI model component to the Chat Output component. This is the raw text output of the model's response. The Chat Output component accepts this text as input and presents it as a formatted message. Click Outputs to view the sent message:
```json
"outputs":
  "text_output":
    "message": "To install Docker on a Mac with an M1 chip, you should use Docker Desktop for Mac, which is optimized for Apple Silicon. Here’s a step-by-step guide to installing Docker on your M1 Mac:\n\n1.
    ...
    "type": "text"
```
Optionally, to view the outputs of each component in the flow, click .
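The log entries shown above are plain JSON, so you can also inspect them in a script. A minimal sketch, with field values copied from the example log (your `component_id` and `session_id` will differ):

```python
import json

# Example log entry from the Chat Input component, as shown in the Logs pane.
log = json.loads("""
{
  "messages": [
    {
      "message": "What's the recommended way to install Docker on Mac M1?",
      "sender": "User",
      "sender_name": "User",
      "session_id": "Session Apr 21, 17:37:04",
      "stream_url": null,
      "component_id": "ChatInput-4WKag",
      "files": [],
      "type": "text"
    }
  ]
}
""")

# Pull out the first message and its sender metadata.
first = log["messages"][0]
print(first["sender"], "->", first["message"])
```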
Send chat messages with the API
The Chat Input component is often the entry point for passing messages to the Langflow API. To send the same example messages programmatically to your Langflow server, do the following:
- To get your Langflow endpoint, click Publish, and then click API access.
- Copy the command from the cURL tab, and then paste it in your terminal. It looks similar to this:
```bash
curl --request POST \
  --url 'http://127.0.0.1:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
  --header 'Content-Type: application/json' \
  --data '{
  "input_value": "What'\''s the recommended way to install Docker on Mac M1?",
  "output_type": "chat",
  "input_type": "chat"
}'
```
- Modify `input_value` so it contains the question: "What's the recommended way to install Docker on Mac M1?"
Note the `output_type` and `input_type` parameters that are passed with the message. The `chat` type provides additional configuration options, and the messages appear in the Playground. The `text` type returns only text strings and does not appear in the Playground.
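The same request can be built and sent from Python. This is a sketch, not an official client: it reuses the server URL and flow ID from the example above, and the commented-out POST uses the third-party `requests` package.

```python
import json

# Flow endpoint from the example above; your flow ID will differ.
LANGFLOW_URL = (
    "http://127.0.0.1:7860/api/v1/run/"
    "51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false"
)


def build_run_payload(message: str, input_type: str = "chat", output_type: str = "chat") -> dict:
    """Build the JSON body for the /run endpoint, mirroring the cURL example."""
    return {
        "input_value": message,
        "input_type": input_type,
        "output_type": output_type,
    }


payload = build_run_payload("What's the recommended way to install Docker on Mac M1?")
print(json.dumps(payload, indent=2))

# To send it (requires a running Langflow server and `pip install requests`):
# import requests
# response = requests.post(LANGFLOW_URL, json=payload)
# print(response.json())
```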
- Add a custom `session_id` to the message's `data` object.
```bash
curl --request POST \
  --url 'http://127.0.0.1:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
  --header 'Content-Type: application/json' \
  --data '{
  "input_value": "Whats the recommended way to install Docker on Mac M1",
  "session_id": "docker-question-on-m1",
  "output_type": "chat",
  "input_type": "chat"
}'
```
The custom `session_id` value starts a new chat session between your client and the Langflow server, which can be useful for keeping conversations and AI context separate.
- Send the POST request. Your request is answered.
- Navigate to the Playground. A new chat session called `docker-question-on-m1` has appeared, using your unique `session_id`.
- To modify additional parameters with Tweaks for your Chat Input and Chat Output components, click Publish, and then click API access.
- Click Tweaks to modify parameters in the component's `data` object. For example, disabling message storage for the Chat Input component adds a tweak to your command:
```bash
curl --request POST \
  --url 'http://127.0.0.1:7860/api/v1/run/51eed711-4530-4fdc-9bce-5db4351cc73a?stream=false' \
  --header 'Content-Type: application/json' \
  --data '{
  "input_value": "Text to input to the flow",
  "output_type": "chat",
  "input_type": "chat",
  "tweaks": {
    "ChatInput-4WKag": {
      "should_store_message": false
    }
  }
}'
```
To confirm your command is using the tweak, navigate to the Logs pane and view the request from the Chat Input component. The value for `should_store_message` is `false`.
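If you build request bodies programmatically, tweaks can be attached to the payload as a nested object keyed by component ID. A minimal sketch; `with_tweak` is a hypothetical helper, not part of Langflow, and `ChatInput-4WKag` is the component ID from the example flow:

```python
import json


def with_tweak(payload: dict, component_id: str, **overrides) -> dict:
    """Return a copy of a run payload with per-component tweaks attached."""
    tweaked = dict(payload)
    tweaks = dict(tweaked.get("tweaks", {}))
    component = dict(tweaks.get(component_id, {}))
    component.update(overrides)
    tweaks[component_id] = component
    tweaked["tweaks"] = tweaks
    return tweaked


base = {
    "input_value": "Text to input to the flow",
    "input_type": "chat",
    "output_type": "chat",
}
# Disable message storage on the Chat Input component, as in the cURL example.
payload = with_tweak(base, "ChatInput-4WKag", should_store_message=False)
print(json.dumps(payload, indent=2))
```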