This page covers LangChain integrations with Microsoft Azure and its related projects. Integration packages for Azure AI, Dynamic Sessions, and SQL Server are maintained in the langchain-azure repository.

Chat models

We recommend developers start with the langchain-azure-ai package to access all the models available in Azure AI Foundry.

Azure AI chat completions

Access models such as Azure OpenAI, DeepSeek R1, Cohere, Phi, and Mistral using the AzureAIChatCompletionsModel class.
pip install -U langchain-azure-ai
Configure your API key and endpoint.
export AZURE_AI_CREDENTIAL=your-api-key
export AZURE_AI_ENDPOINT=your-endpoint
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

llm = AzureAIChatCompletionsModel(
    model_name="gpt-4o",
    api_version="2024-05-01-preview",
)

llm.invoke('Tell me a joke and include some emojis')
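Because AzureAIChatCompletionsModel implements the standard LangChain chat model interface, it also supports streaming. A minimal sketch (the prompt is illustrative):

# Stream the response chunk by chunk instead of waiting for the full message
for chunk in llm.stream("Tell me a joke and include some emojis"):
    print(chunk.content, end="", flush=True)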

Embedding models

Azure AI model inference for embeddings

pip install -U langchain-azure-ai
Configure your API key and endpoint.
export AZURE_AI_CREDENTIAL=your-api-key
export AZURE_AI_ENDPOINT=your-endpoint
from langchain_azure_ai.embeddings import AzureAIEmbeddingsModel

embed_model = AzureAIEmbeddingsModel(
    model_name="text-embedding-ada-002"
)
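As with other LangChain embedding models, you can embed a single query or a batch of documents. A minimal sketch with illustrative input text:

# Embed one query string; returns a list of floats
vector = embed_model.embed_query("What is Azure AI Foundry?")

# Embed several documents at once; returns one vector per document
vectors = embed_model.embed_documents(["First document", "Second document"])

print(len(vector))  # embedding dimensionality, e.g. 1536 for text-embedding-ada-002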

Vector stores

Azure CosmosDB NoSQL is a fully managed, globally distributed, serverless document database for modern applications. It stores data in flexible JSON documents, uses a SQL-like query language, and provides high performance, low latency, and automatic, elastic scalability.

It also features integrated vector search for AI workloads such as generative AI and RAG, letting you store, index, and query vector embeddings alongside your operational data in the same database. You can combine vector similarity search with traditional keyword-based search for more relevant results, and choose from several indexing methods for optimal performance. This unified approach simplifies application architecture and keeps data consistent.
We need to install the azure-cosmos package to use this vector store.
pip install -qU azure-cosmos
from langchain_azure_ai.vectorstores.azure_cosmos_db_no_sql import (
    AzureCosmosDBNoSqlVectorSearch,
)
vector_search = AzureCosmosDBNoSqlVectorSearch.from_documents(
    documents=docs,
    embedding=openai_embeddings,
    cosmos_client=cosmos_client,
    database_name=database_name,
    container_name=container_name,
    vector_embedding_policy=vector_embedding_policy,
    full_text_policy=full_text_policy,
    indexing_policy=indexing_policy,
    cosmos_container_properties=cosmos_container_properties,
    cosmos_database_properties={},
    full_text_search_enabled=True,
)
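The call above expects a Cosmos client and several policy dictionaries that are not shown here. Below is a minimal, hedged sketch of what they might look like, assuming documents keep their embedding under /embedding (1536 dimensions, cosine distance) and their text under /text; the endpoint, key, database, and container names are placeholders to adjust for your account.

from azure.cosmos import CosmosClient, PartitionKey

# Client for your Cosmos DB account (endpoint and key are placeholders)
cosmos_client = CosmosClient(
    "https://<your-account>.documents.azure.com:443/",
    credential="<your-key>",
)

database_name = "langchain_db"          # illustrative name
container_name = "langchain_container"  # illustrative name

# Where vectors live in each document and how they are compared
vector_embedding_policy = {
    "vectorEmbeddings": [
        {
            "path": "/embedding",
            "dataType": "float32",
            "distanceFunction": "cosine",
            "dimensions": 1536,
        }
    ]
}

# Index the vector path; keep default indexing for everything else
indexing_policy = {
    "indexingMode": "consistent",
    "includedPaths": [{"path": "/*"}],
    "excludedPaths": [{"path": '/"_etag"/?'}],
    "vectorIndexes": [{"path": "/embedding", "type": "quantizedFlat"}],
}

# Full-text policy over the document text, used when full_text_search_enabled=True
full_text_policy = {
    "defaultLanguage": "en-US",
    "fullTextPaths": [{"path": "/text", "language": "en-US"}],
}

cosmos_container_properties = {"partition_key": PartitionKey(path="/id")}

Once built, the store supports the usual vector store queries, for example vector_search.similarity_search("your query", k=3).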
See a usage example.
Azure CosmosDB for MongoDB (vCore) makes it easy to create a database with full native MongoDB support. You can apply your MongoDB experience and continue to use your favorite MongoDB drivers, SDKs, and tools by pointing your application to the API for MongoDB (vCore) cluster's connection string.
We need to install the pymongo package to use this vector store.
pip install -qU pymongo
from langchain_azure_ai.vectorstores.azure_cosmos_db_mongo_vcore import (
    AzureCosmosDBMongoVCoreVectorSearch,
)

vectorstore = AzureCosmosDBMongoVCoreVectorSearch.from_documents(
    docs,
    openai_embeddings,
    collection=collection,
    index_name=INDEX_NAME,
)
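The snippet above assumes an existing pymongo collection and an index name. A minimal sketch with placeholder connection details and illustrative database, collection, and index names:

from pymongo import MongoClient

# Connection string for your API for MongoDB (vCore) cluster, copied from the Azure portal (placeholder)
client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongocluster.cosmos.azure.com/?tls=true")

collection = client["langchain_db"]["langchain_collection"]
INDEX_NAME = "vector-index"

After ingestion, you can query the store with the standard vector store interface, for example vectorstore.similarity_search("your query", k=3).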
See a usage example.