For detailed documentation of all AzureOpenAIEmbeddings features and configuration options, please refer to the API reference.
Previously, LangChain.js supported integration with Azure OpenAI using the dedicated Azure OpenAI SDK. This SDK is now deprecated in favor of the new Azure integration in the OpenAI SDK, which provides access to the latest OpenAI models and features the same day they are released, and allows seamless transition between the OpenAI API and Azure OpenAI.

If you are using Azure OpenAI with the deprecated SDK, see the migration guide below to update to the new API.
Overview
Integration details
| Class | Package | Local | Py support |
|---|---|---|---|
| AzureOpenAIEmbeddings | @langchain/openai | ❌ | ✅ |
Setup
To access Azure OpenAI embedding models you’ll need to create an Azure account, get an API key, and install the @langchain/openai integration package.
Credentials
You’ll need to have an Azure OpenAI instance deployed. You can deploy a version on Azure Portal following this guide. Once you have your instance running, make sure you have the name of your instance and key. You can find the key in the Azure Portal, under the “Keys and Endpoint” section of your instance. If you’re using Node.js, you can define the following environment variables to use the service: AZURE_OPENAI_API_KEY, AZURE_OPENAI_API_INSTANCE_NAME, AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME, and AZURE_OPENAI_API_VERSION.

Installation
The LangChain AzureOpenAIEmbeddings integration lives in the @langchain/openai package:
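```shell
npm install @langchain/openai
```

You can also install the same package with `yarn add` or `pnpm add`.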
You can find the list of supported API versions in the Azure OpenAI documentation.
If AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME is not defined, it will fall back to the value of AZURE_OPENAI_API_DEPLOYMENT_NAME for the deployment name. The same applies to the azureOpenAIApiEmbeddingsDeploymentName parameter in the AzureOpenAIEmbeddings constructor, which will fall back to the value of azureOpenAIApiDeploymentName if not defined.

Indexing and Retrieval
Embedding models are often used in retrieval-augmented generation (RAG) flows, both as part of indexing data and later retrieving it. For more detailed instructions, please see our RAG tutorials under the Learn tab. Below, see how to index and retrieve data using the embeddings object we initialized above. In this example, we will index and retrieve a sample document using the demo MemoryVectorStore.
Direct Usage
Under the hood, the vectorstore and retriever implementations are calling embeddings.embedDocuments(...) and embeddings.embedQuery(...) to create embeddings for the text(s) used in fromDocuments and the retriever’s invoke operations, respectively.
You can directly call these methods to get embeddings for your own use cases.
Embed single texts
You can embed queries for search with embedQuery. This generates a vector representation specific to the query:
Embed multiple texts
You can embed multiple texts for indexing with embedDocuments. The internals used for this method may (but do not have to) differ from embedding queries:
Using Azure Managed Identity
If you’re using Azure Managed Identity, you can configure the credentials with the azureADTokenProvider constructor option instead of an API key; see the migration section below for an example.

Using a different domain
If your instance is hosted under a domain other than the default openai.azure.com, you’ll need to use the alternate AZURE_OPENAI_BASE_PATH environment variable.
For example, here’s how you would connect to the domain https://westeurope.api.microsoft.com/openai/deployments/{DEPLOYMENT_NAME}:
Custom headers
You can specify custom headers by passing in a configuration field:
The configuration field also accepts other ClientOptions parameters accepted by the official SDK.
Note: The specific header api-key currently cannot be overridden in this manner and will pass through the value from azureOpenAIApiKey.
Migration from Azure OpenAI SDK
If you are using the deprecated Azure OpenAI SDK with the @langchain/azure-openai package, you can update your code to use the new Azure integration following these steps:

- Install the new @langchain/openai package and remove the previous @langchain/azure-openai package.
- Update your imports to use the new AzureOpenAIEmbeddings class from the @langchain/openai package.
- Update your code to use the new AzureOpenAIEmbeddings class and pass the required parameters. Notice that the constructor now requires the azureOpenAIApiInstanceName parameter instead of the azureOpenAIEndpoint parameter, and adds the azureOpenAIApiVersion parameter to specify the API version.
- If you were using Azure Managed Identity, you now need to pass the azureADTokenProvider parameter to the constructor instead of credentials; see the Azure Managed Identity section for more details.
- If you were using environment variables, you now have to set the AZURE_OPENAI_API_INSTANCE_NAME environment variable instead of AZURE_OPENAI_API_ENDPOINT, and add the AZURE_OPENAI_API_VERSION environment variable to specify the API version.
API reference
For detailed documentation of all AzureOpenAIEmbeddings features and configurations head to the API reference.