
Prompts

A prompt is an instruction you give to a language model to achieve a specific result. Your task is to help the model understand what you are trying to accomplish so that it can generate a coherent and correct response. A good prompt is structured to provide context and guidance to the model, allowing it to generate a meaningful response. Prompts are constrained by the context length of the language model used.
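As a rough illustration of that structure, the sketch below assembles a prompt with separate context, task, and constraint sections. The section names and wording are only an example, not a required Prompt Studio format.

```python
# A minimal sketch of a structured prompt. The section names and wording
# below are illustrative; they are not a prescribed format.
article = (
    "Solar panels convert sunlight into electricity using photovoltaic cells. "
    "Their efficiency depends on cell material, temperature, and orientation."
)

prompt = f"""You are a helpful technical writer.

Context:
{article}

Task:
Summarize the context above in two bullet points for a non-expert reader.

Constraints:
- Keep each bullet under 15 words.
- Do not add information that is not in the context.
"""

print(prompt)
```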

Prompt Studio prompts are used in promptbook steps.

Writing good prompts has become a domain of its own, and there are many resources out there on how to write better prompts. An excellent place to start is the prompt engineering guide.

Prompt Chains

We recommend splitting complex tasks into smaller, independent steps rather than handling everything in one prompt. You can do this by adding new steps to your promptbooks. By default, the results generated in a step are not connected to and do not influence other steps. By splitting a complex prompt into several steps (illustrated in the sketch after this list), you can:

  • improve the quality of the results you get
  • identify which parts of a task the LLM struggles with
  • keep the way a task is solved transparent
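As a rough sketch of the idea (outside of promptbooks), the plain-Python example below splits one task into two focused prompts and passes the result of the first into the second. `call_llm` is a hypothetical stand-in for whatever model client you use.

```python
from typing import Callable


def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for an LLM call; replace with your model client."""
    raise NotImplementedError


def summarize_then_translate(article: str,
                             llm: Callable[[str], str] = call_llm) -> str:
    # Step 1: a small, focused prompt that only summarizes.
    summary = llm(f"Summarize the following article in five sentences:\n\n{article}")
    # Step 2: a second focused prompt that only translates the summary.
    return llm(f"Translate the following summary into French:\n\n{summary}")
```

Because each step has a single responsibility, you can inspect the intermediate summary on its own and see exactly where the model struggles.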

Meta Prompts

A meta prompt is a prompt used to generate other prompts that perform well at specific tasks. Meta prompts usually provide a framework or format that the language model follows when creating a prompt from a user's instructions.
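As an illustration, the sketch below wraps a user instruction in a meta prompt template that asks the model to write a task-specific prompt. The framework wording is an assumption for the example, not a prescribed format.

```python
# A minimal meta prompt sketch: the template asks the model to write a prompt
# for a user-supplied task. The framework wording below is illustrative.
META_PROMPT = """You are an expert prompt engineer.

Write a prompt for a language model that accomplishes the user's task.
The prompt you write must:
1. State the model's role in one sentence.
2. Include any context the model needs.
3. Specify the output format explicitly.

User task: {task}

Return only the prompt, with no extra commentary."""


def build_meta_prompt(task: str) -> str:
    return META_PROMPT.format(task=task)


print(build_meta_prompt("Classify customer support emails by urgency."))
```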