You are currently on a page documenting the use of Azure OpenAI text completion models. The latest and most popular Azure OpenAI models are chat completion models. Unless you are specifically using `gpt-3.5-turbo-instruct`, you are probably looking for this page instead.

Previously, LangChain.js supported integration with Azure OpenAI using the dedicated Azure OpenAI SDK. This SDK is now deprecated in favor of the new Azure integration in the OpenAI SDK, which allows you to access the latest OpenAI models and features the same day they are released, and allows seamless transition between the OpenAI API and Azure OpenAI. If you are using Azure OpenAI with the deprecated SDK, see the migration guide to update to the new API.
For documentation of all AzureOpenAI features and configuration options, please refer to the API reference.
Overview
Integration details
| Class | Package | Local | Serializable | PY support |
|---|---|---|---|---|
| AzureOpenAI | @langchain/openai | ❌ | ✅ | ✅ |
Setup
To access AzureOpenAI models you’ll need to create an Azure account, get an API key, and install the `@langchain/openai` integration package.
Credentials
Head to azure.microsoft.com to sign up for Azure OpenAI and generate an API key. You’ll also need to have an Azure OpenAI instance deployed. You can deploy a version on the Azure Portal following this guide. Once you have your instance running, make sure you have the name of your instance and key. You can find the key in the Azure Portal, under the “Keys and Endpoint” section of your instance. If you’re using Node.js, you can define the following environment variables to use the service; alternatively, these values can be passed directly to the `AzureOpenAI` constructor.
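For example (a sketch: `AZURE_OPENAI_API_INSTANCE_NAME` and `AZURE_OPENAI_API_VERSION` are the variables referenced in the migration notes below; the key and deployment name variables are assumed to follow the same naming convention, and all values are placeholders):

```bash
export AZURE_OPENAI_API_KEY="<your_key>"
export AZURE_OPENAI_API_INSTANCE_NAME="<your_instance_name>"
export AZURE_OPENAI_API_DEPLOYMENT_NAME="<your_deployment_name>"
export AZURE_OPENAI_API_VERSION="<api_version>"
```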
If you want to get automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:
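For example (a sketch; the exact variable names depend on your LangSmith setup):

```bash
# export LANGSMITH_TRACING="true"
# export LANGSMITH_API_KEY="your-api-key"
```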
Installation
The LangChain AzureOpenAI integration lives in the `@langchain/openai` package:
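For example, with npm (yarn or pnpm work equally well):

```bash
npm install @langchain/openai
```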
Instantiation
Now we can instantiate our model object and generate completions:
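A minimal sketch, assuming the `azureOpenAIApi*` constructor parameters referenced in the migration notes below (values are placeholders and can instead come from the environment variables above):

```typescript
import { AzureOpenAI } from "@langchain/openai";

const llm = new AzureOpenAI({
  azureOpenAIApiKey: "<your_key>",
  azureOpenAIApiInstanceName: "<your_instance_name>",
  azureOpenAIApiDeploymentName: "<your_deployment_name>",
  azureOpenAIApiVersion: "<api_version>",
  temperature: 0,
});
```

Invocation
A quick completion call against the `llm` instance created above (sketch):

```typescript
const inputText = "Azure OpenAI is an AI service that ";

// LLM instances return the raw completion string.
const completion = await llm.invoke(inputText);
console.log(completion);
```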
Using Azure Managed Identity
If you’re using Azure Managed Identity, you can configure the credentials like this:
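A sketch using the `@azure/identity` package; the `azureADTokenProvider` parameter is the one referenced in the migration notes below, and the token scope shown is the standard Azure Cognitive Services scope:

```typescript
import {
  DefaultAzureCredential,
  getBearerTokenProvider,
} from "@azure/identity";
import { AzureOpenAI } from "@langchain/openai";

// Acquire tokens via the managed identity (or any other credential
// supported by DefaultAzureCredential).
const credentials = new DefaultAzureCredential();
const azureADTokenProvider = getBearerTokenProvider(
  credentials,
  "https://cognitiveservices.azure.com/.default"
);

const model = new AzureOpenAI({
  azureADTokenProvider,
  azureOpenAIApiInstanceName: "<your_instance_name>",
  azureOpenAIApiDeploymentName: "<your_deployment_name>",
  azureOpenAIApiVersion: "<api_version>",
});
```

Using a different domain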
If your instance is hosted under a domain other than the default `openai.azure.com`, you’ll need to use the alternate `AZURE_OPENAI_BASE_PATH` environment variable.
For example, here’s how you would connect to the domain https://westeurope.api.microsoft.com/openai/deployments/{DEPLOYMENT_NAME}:
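A sketch assuming an `azureOpenAIBasePath` constructor parameter that mirrors the `AZURE_OPENAI_BASE_PATH` environment variable; the deployment name is still passed separately:

```typescript
import { AzureOpenAI } from "@langchain/openai";

const model = new AzureOpenAI({
  azureOpenAIApiKey: "<your_key>",
  azureOpenAIApiDeploymentName: "<your_deployment_name>",
  azureOpenAIApiVersion: "<api_version>",
  // Everything up to (but not including) the deployment name.
  azureOpenAIBasePath:
    "https://westeurope.api.microsoft.com/openai/deployments",
});
```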
Migration from Azure OpenAI SDK
If you are using the deprecated Azure OpenAI SDK with the `@langchain/azure-openai` package, you can update your code to use the new Azure integration following these steps:

- Install the new `@langchain/openai` package and remove the previous `@langchain/azure-openai` package (see the install commands sketched after this list).
- Update your imports to use the new `AzureOpenAI` and `AzureChatOpenAI` classes from the `@langchain/openai` package (see the import sketch after this list).
- Update your code to use the new `AzureOpenAI` and `AzureChatOpenAI` classes and pass the required parameters (see the constructor sketch after this list). Notice that the constructor now requires the `azureOpenAIApiInstanceName` parameter instead of the `azureOpenAIEndpoint` parameter, and adds the `azureOpenAIApiVersion` parameter to specify the API version.
- If you were using Azure Managed Identity, you now need to pass the `azureADTokenProvider` parameter to the constructor instead of `credentials`; see the Azure Managed Identity section for more details.
- If you were using environment variables, you now have to set the `AZURE_OPENAI_API_INSTANCE_NAME` environment variable instead of `AZURE_OPENAI_API_ENDPOINT`, and add the `AZURE_OPENAI_API_VERSION` environment variable to specify the API version.
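For the first step, the package swap could look like this (npm shown; use your package manager’s equivalent):

```bash
npm install @langchain/openai
npm uninstall @langchain/azure-openai
```

For the second step, a sketch of the updated import (use `AzureChatOpenAI` instead if you are migrating a chat model):

```typescript
import { AzureOpenAI } from "@langchain/openai";
```

For the third step, a minimal constructor sketch using the parameters named in the list above (placeholder values):

```typescript
import { AzureOpenAI } from "@langchain/openai";

const model = new AzureOpenAI({
  azureOpenAIApiKey: "<your_key>",
  azureOpenAIApiInstanceName: "<your_instance_name>",
  azureOpenAIApiDeploymentName: "<your_deployment_name>",
  azureOpenAIApiVersion: "<api_version>",
});
```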
API reference
For detailed documentation of all AzureOpenAI features and configurations, head to the API reference.