
Basic Prompting

Prompts serve as the inputs to a large language model (LLM), acting as the interface between human instructions and computational tasks.

By submitting natural language requests in a prompt to an LLM, you can obtain answers, generate text, and solve problems.

This article demonstrates how to use Langflow's prompt tools to issue basic prompts to an LLM, and how various prompting strategies can affect your outcomes.

Prerequisites

  * A running Langflow installation
  * An OpenAI API key

Create the basic prompting flow

  1. From the Langflow dashboard, click New Flow.

  2. Select Basic Prompting. The Basic Prompting flow is created.

This flow allows you to chat with the OpenAI component through the Prompt component.

Examine the Prompt component. The Template field instructs the LLM to "Answer the user as if you were a pirate." This should be interesting...
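In effect, the template is a standing instruction that travels to the model alongside whatever you type in the chat. The sketch below shows the same idea outside Langflow with the OpenAI Python client, modeling the template as a system message; the model name and the user message are illustrative assumptions, not values taken from the flow.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is available in the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any chat-capable model works here
    messages=[
        # The system message plays the role of the Prompt component's template.
        {"role": "system", "content": "Answer the user as if you were a pirate."},
        # The user message corresponds to what you type in the Playground.
        {"role": "user", "content": "What is the weather like today?"},
    ],
)
print(response.choices[0].message.content)
```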

  1. To create an environment variable for the OpenAI component, click the Globe button in the OpenAI API Key field, and then click Add New Variable. (For how the same key is typically handled outside Langflow, see the sketch after these steps.)

    1. In the Variable Name field, enter openai_api_key.
    2. In the Value field, paste your OpenAI API Key (sk-...).
    3. Click Save Variable.
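Outside Langflow, the same key is usually kept in an operating-system environment variable rather than hard-coded. A minimal sketch in Python, assuming the variable is named OPENAI_API_KEY (a common convention, not something the flow requires):

```python
import os

# Read the key from the environment so it never appears in source code.
# Assumes you have exported it first, e.g. `export OPENAI_API_KEY=sk-...`.
api_key = os.environ.get("OPENAI_API_KEY")
if api_key is None:
    raise RuntimeError("OPENAI_API_KEY is not set")
```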

Run the basic prompting flow

  1. Click the Playground button on the control panel (bottom right side of the workspace). This is where you can interact with your AI.
  2. Type a message and press Enter. The bot should respond in a markedly piratical manner!

Modify the prompt for a different result

  1. To modify your prompt results, in the Prompt component, click the Template field. The Edit Prompt window opens.
  2. Change "Answer the user as if you were a pirate" to a different character, perhaps "Answer the user as if you were Hermione Granger."
  3. Run the flow again. The response will be markedly different; for the same comparison expressed in code, see the sketch below.
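To see that only the persona instruction drives the change, hold the user message constant and swap just the template. This sketch reuses the same assumptions as before (the OpenAI Python client, an illustrative model name, and an invented question):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is available in the environment
question = "How do I get home before dark?"  # invented example question

for persona in ("a pirate", "Hermione Granger"):
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model works here
        messages=[
            # Only the persona in the system message changes between runs.
            {"role": "system", "content": f"Answer the user as if you were {persona}."},
            {"role": "user", "content": question},
        ],
    )
    print(f"--- {persona} ---")
    print(reply.choices[0].message.content)
```

The two replies should differ in tone and vocabulary even though the question is identical, which is exactly the effect of changing the template.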
