The LangSmith playground allows you to use any model that is compliant with the OpenAI API. You can use your own model by setting the provider to OpenAI Compatible Endpoint in the playground.

Deploy an OpenAI-compliant model

Many providers offer OpenAI-compliant models or proxy services. You can use these providers to deploy your model and get an API endpoint that is compliant with the OpenAI API. Take a look at the full OpenAI API specification for more information.
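Before configuring the playground, it can help to confirm that your deployment actually speaks the OpenAI chat completions API. The sketch below uses the official OpenAI Python SDK pointed at a custom base URL; the URL http://localhost:8000/v1, the model name my-model, and the placeholder API key are assumptions to replace with your own deployment's values.

```python
from openai import OpenAI

# Point the OpenAI SDK at your own OpenAI-compatible endpoint.
# Base URL, model name, and API key below are placeholders.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="YOUR_API_KEY",
)

# If this call succeeds, the endpoint implements the chat completions API
# that the playground expects.
response = client.chat.completions.create(
    model="my-model",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```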

Use the model in the LangSmith Playground

Once you have deployed a model server, you can use it in the LangSmith Playground. To access the Prompt Settings menu:
  1. Under the Prompts heading, select the gear icon next to the model name.
  2. In the Model Configuration tab, select the model to edit in the dropdown.
  3. For the Provider dropdown, select OpenAI Compatible Endpoint.
  4. Add your OpenAI Compatible Endpoint to the Base URL input.
    (Screenshot: Model Configuration window in the LangSmith UI with a model selected and the Provider dropdown set to OpenAI Compatible Endpoint.)
If everything is set up correctly, you should see the model's response in the playground. You can also use this functionality to invoke downstream pipelines. For information on how to store your model configuration, refer to Configure prompt settings.
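If you later want to call the same endpoint from code in a downstream pipeline, the Provider and Base URL settings above correspond roughly to pointing a LangChain ChatOpenAI client at that URL. This is a minimal sketch, assuming the same placeholder endpoint (http://localhost:8000/v1) and model name (my-model) as before:

```python
from langchain_openai import ChatOpenAI

# Mirrors the playground settings: Base URL points at the
# OpenAI-compatible endpoint; model name and API key are placeholders.
llm = ChatOpenAI(
    base_url="http://localhost:8000/v1",
    api_key="YOUR_API_KEY",
    model="my-model",
)

print(llm.invoke("Say hello in one sentence.").content)
```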