
Backend-Only

Info: This page may contain outdated information. It will be updated as soon as possible.

You can run Langflow in --backend-only mode to expose your Langflow app as an API, without running the frontend UI.

Start Langflow in backend-only mode with python3 -m langflow run --backend-only.

The terminal prints Welcome to ⛓ Langflow, and a blank window opens at http://127.0.0.1:7864/all. Langflow will now serve requests to its API without the frontend running.
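If you want to confirm the headless server is responding before wiring up a client, you can hit it from Python. This is a minimal sketch that assumes the host and port used in this guide (127.0.0.1:7864) and that the /health endpoint is enabled on your server:

```python
import requests

# Quick check that the backend-only Langflow server is up.
# Assumes the host/port from this guide and the /health endpoint.
resp = requests.get("http://127.0.0.1:7864/health", timeout=5)
print(resp.status_code, resp.text)  # expect a 200 with a small status payload
```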

Prerequisites

Download your flow's curl call

  1. Click API.
  2. Click curl > Copy code and save the code to your local machine. It will look something like this:

```bash
curl -X POST \
  "http://127.0.0.1:7864/api/v1/run/ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef?stream=false" \
  -H 'Content-Type: application/json' \
  -d '{"input_value": "message",
    "output_type": "chat",
    "input_type": "chat",
    "tweaks": {
      "Prompt-kvo86": {},
      "OpenAIModel-MilkD": {},
      "ChatOutput-ktwdw": {},
      "ChatInput-xXC4F": {}
  }}'
```

Note the flow ID of ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef. You can find this ID in the UI as well to ensure you're querying the right flow.

Start Langflow in backend-only mode

  1. Stop Langflow with Ctrl+C.
  2. Start Langflow in backend-only mode with python3 -m langflow run --backend-only. The terminal prints Welcome to ⛓ Langflow, and a blank window opens at http://127.0.0.1:7864/all. Langflow will now serve requests to its API.
  3. Run the curl code you copied from the UI. You should get a result like this:

```json
{"session_id":"ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef:bf81d898868ac87e1b4edbd96c131c5dee801ea2971122cc91352d144a45b880","outputs":[{"inputs":{"input_value":"hi, are you there?"},"outputs":[{"results":{"result":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?"},"artifacts":{"message":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?","sender":"Machine","sender_name":"AI"},"messages":[{"message":"Arrr, ahoy matey! Aye, I be here. What be ye needin', me hearty?","sender":"Machine","sender_name":"AI","component_id":"ChatOutput-ktwdw"}],"component_display_name":"Chat Output","component_id":"ChatOutput-ktwdw","used_frozen_result":false}]}]}
```

Again, note that the flow ID matches. Langflow is receiving your POST request, running the flow, and returning the result, all without running the frontend. Cool!
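The reply text is nested a few levels down in that JSON. As a rough sketch of how you might pull it out from Python, assuming the same flow ID and the response shape shown above (other flows may nest their outputs differently):

```python
import requests

# Call the same endpoint as the curl example and extract the chat reply.
# The flow ID and the response nesting are taken from the example output above.
url = "http://127.0.0.1:7864/api/v1/run/ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef?stream=false"
payload = {"input_value": "hi, are you there?", "output_type": "chat", "input_type": "chat"}
data = requests.post(url, json=payload).json()

reply = data["outputs"][0]["outputs"][0]["results"]["result"]
print(reply)  # e.g. "Arrr, ahoy matey! Aye, I be here. ..."
```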

Download your flow's Python API call

Instead of using curl, you can download your flow as a Python API call.

  1. Click API.
  2. Click Python API > Copy code and save the code to your local machine as app.py. The code will look something like this:

```python
import requests
from typing import Optional

BASE_API_URL = "http://127.0.0.1:7864/api/v1/run"
FLOW_ID = "ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef"
# You can tweak the flow by adding a tweaks dictionary
# e.g. {"OpenAI-XXXXX": {"model_name": "gpt-4"}}

def run_flow(message: str,
             flow_id: str,
             output_type: str = "chat",
             input_type: str = "chat",
             tweaks: Optional[dict] = None,
             api_key: Optional[str] = None) -> dict:
    """Run a flow with a given message and optional tweaks.

    :param message: The message to send to the flow
    :param flow_id: The ID of the flow to run
    :param tweaks: Optional tweaks to customize the flow
    :return: The JSON response from the flow
    """
    api_url = f"{BASE_API_URL}/{flow_id}"
    payload = {
        "input_value": message,
        "output_type": output_type,
        "input_type": input_type,
    }
    headers = None
    if tweaks:
        payload["tweaks"] = tweaks
    if api_key:
        headers = {"x-api-key": api_key}
    response = requests.post(api_url, json=payload, headers=headers)
    return response.json()

# Setup any tweaks you want to apply to the flow
message = "message"

print(run_flow(message=message, flow_id=FLOW_ID))
```

  3. Run your Python app:

```bash
python3 app.py
```

The result is similar to the curl call:


```text
{'session_id': 'ef7e0554-69e5-4e3e-ab29-ee83bcd8d9ef:bf81d898868ac87e1b4edbd96c131c5dee801ea2971122cc91352d144a45b880', 'outputs': [{'inputs': {'input_value': 'message'}, 'outputs': [{'results': {'result': "Arrr matey! What be yer message for this ol' pirate? Speak up or walk the plank!"}, 'artifacts': {'message': "Arrr matey! What be yer message for this ol' pirate? Speak up or walk the plank!", 'sender': 'Machine', 'sender_name': 'AI'}, 'messages': [{'message': "Arrr matey! What be yer message for this ol' pirate? Speak up or walk the plank!", 'sender': 'Machine', 'sender_name': 'AI', 'component_id': 'ChatOutput-ktwdw'}], 'component_display_name': 'Chat Output', 'component_id': 'ChatOutput-ktwdw', 'used_frozen_result': False}]}]}
```

Your Python app POSTs to your Langflow server, and the server runs the flow and returns the result.
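The tweaks dictionary mentioned in the code comment lets you override component fields per request, and api_key sends the x-api-key header. Here is a hedged example that reuses run_flow and the component IDs from this guide's flow; your component IDs and field names will differ:

```python
# Override a component field and authenticate, reusing run_flow and FLOW_ID from above.
# "OpenAIModel-MilkD" and "model_name" come from this guide's example flow;
# replace them with the component IDs and fields of your own flow.
tweaks = {"OpenAIModel-MilkD": {"model_name": "gpt-4"}}

result = run_flow(
    message="hi, are you there?",
    flow_id=FLOW_ID,
    tweaks=tweaks,
    api_key="YOUR_LANGFLOW_API_KEY",  # only needed if your server requires an API key
)
print(result)
```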

See API for more ways to interact with your headless Langflow server.
