## Overview

### Integration details
| Class | Package | Local | Serializable | JS support |
|---|---|---|---|---|
| ChatOCIGenAI | langchain-community | ❌ | ❌ | ❌ |
### Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
|---|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ❌ | ❌ | ❌ |
## Setup
To access OCIGenAI models you'll need to install the `oci` and `langchain-community` packages.
### Credentials
The credentials and authentication methods supported for this integration are equivalent to those used with other OCI services and follow the standard SDK authentication methods: API key, session token, instance principal, and resource principal. API key is the default authentication method and is the one used in the examples below. The following example demonstrates how to use a different authentication method (session token).
### Installation

The LangChain OCIGenAI integration lives in the `langchain-community` package, and you will also need to install the `oci` package:
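Both packages install from PyPI; `-U` upgrades any existing versions:

```shell
pip install -U langchain-community oci
```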
## Instantiation
Now we can instantiate our model object and generate chat completions:

## Invocation
## API reference
For detailed documentation of all ChatOCIGenAI features and configurations, head to the [API reference](https://python.langchain.com/api_reference/community/chat_models/langchain_community.chat_models.oci_generative_ai.ChatOCIGenAI.html).