IBM watsonx.ai
This will help you get started with IBM watsonx.ai chat models. For detailed documentation of all ChatWatsonx features and configurations, head to the IBM watsonx.ai API reference.
Overview
Integration details
| Class | Package | Local | Serializable | PY support | Downloads | Version |
| --- | --- | --- | --- | --- | --- | --- |
| ChatWatsonx | @langchain/community | ❌ | ✅ | ✅ | | |
Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Token usage | Logprobs |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ | ❌ |
Setup
To access IBM watsonx.ai models you'll need to create an IBM watsonx.ai account, get an API key, and install the `@langchain/community` integration package.
Credentials
Head to IBM Cloud to sign up for IBM watsonx.ai and generate an API key, or use one of the other authentication methods listed below:

- IAM authentication
- Bearer token authentication
- IBM watsonx.ai software authentication
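As a sketch, credentials can be supplied through environment variables. The variable names below follow the `@langchain/community` watsonx integration's conventions, where `WATSONX_AI_AUTH_TYPE` selects the authentication method; verify them against the current integration docs before relying on them:

```shell
# IAM authentication (IBM Cloud API key)
export WATSONX_AI_AUTH_TYPE=iam
export WATSONX_AI_APIKEY="your-api-key"

# Bearer token authentication
# export WATSONX_AI_AUTH_TYPE=bearertoken
# export WATSONX_AI_BEARER_TOKEN="your-bearer-token"

# IBM watsonx.ai software authentication (self-hosted instance)
# export WATSONX_AI_AUTH_TYPE=cp4d
# export WATSONX_AI_USERNAME="your-username"
# export WATSONX_AI_PASSWORD="your-password"
# export WATSONX_AI_URL="your-instance-url"
```

Only one authentication method should be active at a time; the unused blocks are commented out above.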
Installation
The LangChain IBM watsonx.ai integration lives in the `@langchain/community` package:
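Install it together with `@langchain/core` (required as a peer dependency):

```shell
npm install @langchain/community @langchain/core
```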
Instantiation
Now we can instantiate our model object and generate chat completions:

- You must provide a `spaceId`, `projectId` or `idOrName` (deployment id), unless you use the lightweight engine, which works without specifying either (refer to the watsonx.ai docs).
- Depending on the region of your provisioned service instance, use the correct `serviceUrl`.
Invocation
Chaining
We can chain our model with a prompt template like so:

Streaming the Model output
Tool calling
API reference
For detailed documentation of all IBM watsonx.ai features and configurations, head to the API reference: API docs