Basic Prompting

warning

This page may contain outdated information. It will be updated as soon as possible.

Prompts serve as the inputs to a large language model (LLM), acting as the interface between human instructions and computational tasks.

By submitting natural language requests in a prompt to an LLM, you can obtain answers, generate text, and solve problems.

This article demonstrates how to use Langflow's prompt tools to issue basic prompts to an LLM, and how various prompting strategies can affect your outcomes.

Prerequisites

Create the basic prompting project

  1. From the Langflow dashboard, click New Project.
  2. Select Basic Prompting. The Basic Prompting flow is created.

This flow allows you to chat with the OpenAI component via a Prompt component. Examine the Prompt component. The Template field instructs the LLM to Answer the user as if you were a pirate. This should be interesting...
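Conceptually, the Prompt component's template becomes the system instruction that accompanies every user message. The sketch below illustrates this idea in plain Python; `build_messages` is a hypothetical helper, not Langflow's internal API.

```python
# A minimal sketch of how a prompt template steers the model: the template
# is sent as the system message, and the user's chat input rides alongside it.
TEMPLATE = "Answer the user as if you were a pirate."

def build_messages(user_input: str) -> list[dict]:
    """Pair the template (system role) with the user's message."""
    return [
        {"role": "system", "content": TEMPLATE},
        {"role": "user", "content": user_input},
    ]

messages = build_messages("What is an LLM?")
```

Because the system message is prepended to every turn, the model answers in character no matter what the user types.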

  1. To store your OpenAI API key as an environment variable, in the OpenAI API Key field of the OpenAI component, click the Globe button, and then click Add New Variable.
    1. In the Variable Name field, enter openai_api_key.
    2. In the Value field, paste your OpenAI API Key (sk-...).
    3. Click Save Variable.
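If you prefer to keep the key out of the UI entirely, the OpenAI Python SDK reads the standard `OPENAI_API_KEY` environment variable by default. Whether the OpenAI component falls back to it is an assumption here; the in-app variable above is the documented route.

```shell
# Export the key in the shell where you start Langflow.
# Replace the placeholder with your real key (sk-...).
export OPENAI_API_KEY="sk-..."
```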

Run the basic prompting flow

  1. Click the Run button. The Interaction Panel opens, where you can converse with your bot.
  2. Type a message and press Enter. The bot responds in a markedly piratical manner!

Modify the prompt for a different result

  1. To modify your prompt results, in the Prompt template, click the Template field. The Edit Prompt window opens.
  2. Change Answer the user as if you were a pirate to a different character, perhaps Answer the user as if you were Harold Abelson.
  3. Run the basic prompting flow again. The response changes to match the new character.
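The steps above boil down to swapping one string: the flow, the model, and the user message are unchanged, and only the template differs between runs. The sketch below makes that explicit; `render` is a hypothetical stand-in for what the Prompt component does when it fills the template.

```python
# Only the template string changes between the two runs.
def render(template: str, user_input: str) -> str:
    """Combine a persona template with the user's message into one prompt."""
    return f"System: {template}\nUser: {user_input}"

question = "What is recursion?"
pirate = render("Answer the user as if you were a pirate.", question)
abelson = render("Answer the user as if you were Harold Abelson.", question)
```

Two different templates yield two different prompts, and therefore two markedly different answers to the same question.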
