Prompts guide the behavior of large language models (LLMs). Prompt engineering is the process of crafting, testing, and refining the instructions you give to an LLM so it produces reliable and useful responses. LangSmith provides tools to create, version, test, and collaborate on prompts. You’ll also encounter common concepts like prompt templates, which let you reuse structured prompts, and variables, which allow you to dynamically insert values (such as a user’s question) into a prompt. In this quickstart, you’ll create, test, and improve prompts using either the UI or the SDK. This quickstart uses OpenAI as the example LLM provider, but the same workflow applies to other providers.
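To make the template-and-variable concept concrete, here is a minimal sketch in plain Python (the template text and variable names are illustrative, not part of the LangSmith API):

```python
# A prompt template: reusable text with named placeholders.
template = (
    "You are a helpful assistant that answers questions about {topic}.\n"
    "Question: {question}"
)

# Variables are substituted when the prompt is rendered.
prompt = template.format(
    topic="astronomy",
    question="Why is the sky blue?",
)

print(prompt)
```

LangSmith prompt templates work the same way conceptually: the template is stored and versioned, and the variables are filled in at run time.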
If you prefer to watch a video on getting started with prompt engineering, refer to the quickstart Video guide.

Prerequisites

Before you begin, make sure you have a LangSmith account and an API key from your model provider (OpenAI in this quickstart). Then select the tab for the UI or SDK workflow:

1. Set workspace secret

In the LangSmith UI for your workspace, ensure that your OpenAI API key is set as a workspace secret.
  1. Navigate to Settings and then move to the Secrets tab.
  2. On the Workspace Secrets page, select Add secret, enter OPENAI_API_KEY as the Key, and paste your API key as the Value.
  3. Select Save secret.
When adding workspace secrets in the LangSmith UI, make sure the secret keys match the environment variable names expected by your model provider.
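If you are following the SDK workflow instead, secrets are read from environment variables rather than workspace secrets. A typical setup looks like the following (the variable names assume the LangSmith and OpenAI defaults; replace the placeholder values with your own keys):

```shell
export LANGSMITH_API_KEY="<your-langsmith-api-key>"
export OPENAI_API_KEY="<your-openai-api-key>"
```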

2. Create a prompt

  1. In the LangSmith UI, navigate to the Prompts section in the left-hand menu.
  2. Click on + Prompt to create a prompt.
  3. Modify the prompt by editing or adding prompt messages and input variables as needed.
Prompt playground with the system prompt ready for editing.

3. Test a prompt

  1. Under the Prompts heading, select the gear icon next to the model name. This opens the Prompt Settings window on the Model Configuration tab.
  2. Set the model configuration you want to use. The Provider and Model you select will determine the parameters that are configurable on this configuration page. Once set, click Save as.
    Model Configuration window in the LangSmith UI, settings for Provider, Model, Temperature, Max Output Tokens, Top P, Presence Penalty, Frequency Penalty, Reasoning Effort, etc.
  3. Specify the input variables you would like to test in the Inputs box and then click Start.
    The input box with a question entered. The output box contains the response to the prompt.
    To learn about more options for configuring your prompt in the Playground, refer to Configure prompt settings.
  4. After testing and refining your prompt, click Save to store it for future use.
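With the SDK, you can pull a saved prompt and run it against your model instead of using the Playground. A sketch (assumes a saved prompt named `my-first-prompt` and the `langchain-openai` package; the network calls are skipped unless both API keys are set):

```python
import os

# Example input for the prompt's {question} variable.
inputs = {"question": "Why is the sky blue?"}

# Pull the saved prompt and run it (requires LANGSMITH_API_KEY and OPENAI_API_KEY).
if os.environ.get("LANGSMITH_API_KEY") and os.environ.get("OPENAI_API_KEY"):
    from langchain_openai import ChatOpenAI
    from langsmith import Client

    client = Client()
    prompt = client.pull_prompt("my-first-prompt")

    # Chain the prompt into the model and invoke it with the inputs.
    chain = prompt | ChatOpenAI(model="gpt-4o-mini")
    response = chain.invoke(inputs)
    print(response.content)
```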

4. Iterate on a prompt

LangSmith allows for team-based prompt iteration. Workspace members can experiment with prompts in the Playground and save their changes as a new commit when ready. To improve your prompts:
  • Reference the documentation provided by your model provider for best practices in prompt creation.
  • Build and refine your prompts with the Prompt Canvas—an interactive tool in LangSmith. Learn more in the Prompt Canvas guide.
  • Tag specific commits to mark important moments in your commit history.
    1. To create a commit, navigate to the Playground and select Commit. Choose the prompt to commit changes to, then select Commit.
    2. Navigate to Prompts in the left-hand menu. Select the prompt. Once on the prompt’s detail page, move to the Commits tab. Find the tag icon to Add a Commit Tag.
    The Commits tab, showing the tag icon and the commit tag name box used to create a tag.

Next steps

Video guide