Chat models
We recommend developers start with the langchain-azure-ai package to access all the models available in Azure AI Foundry.
Azure AI chat completions
Access models such as Azure OpenAI, DeepSeek R1, Cohere, Phi, and Mistral using the AzureAIChatCompletionsModel class.
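Below is a minimal sketch of instantiating and calling the chat model. It assumes an Azure AI Foundry model inference endpoint and API key; the endpoint URL, credential, and model name are placeholders, and parameter names can differ slightly between langchain-azure-ai versions.

```python
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel

# Endpoint, key, and model name below are placeholders for your own deployment.
llm = AzureAIChatCompletionsModel(
    endpoint="https://<your-resource>.services.ai.azure.com/models",
    credential="<your-api-key>",
    model="mistral-large-2407",  # any model deployed to your Foundry resource
)

# Standard LangChain invocation: pass a string (or a list of messages).
response = llm.invoke("Summarize what Azure AI Foundry offers in one sentence.")
print(response.content)
```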
Embedding models
Azure AI model inference for embeddings
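Embedding models deployed behind the Azure AI model inference API can be used the same way. A minimal sketch, assuming the AzureAIEmbeddingsModel class from langchain-azure-ai (the endpoint, credential, and model name are placeholders):

```python
from langchain_azure_ai.embeddings import AzureAIEmbeddingsModel

# Assumes an Azure AI model inference endpoint with an embeddings model deployed.
embeddings = AzureAIEmbeddingsModel(
    endpoint="https://<your-resource>.services.ai.azure.com/models",
    credential="<your-api-key>",
    model="text-embedding-3-small",  # example embeddings model name
)

# Embed a single query string; embed_documents() handles batches.
vector = embeddings.embed_query("What is Azure AI Foundry?")
print(len(vector))
```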
Vector stores
Azure Cosmos DB NoSQL Vector Search
Azure Cosmos DB NoSQL is a fully managed, globally distributed, serverless document database for modern applications. It stores data in flexible JSON documents, uses a SQL-like query language, and offers high performance, low latency, and automatic, elastic scalability. Its integrated vector search supports AI workloads such as generative AI and RAG: you can store, index, and query vector embeddings alongside your operational data in the same database, combine vector similarity search with traditional keyword-based search, and choose from several indexing methods to tune performance. Keeping vectors and operational data together simplifies application architecture and keeps data consistent.

Install the azure-cosmos package to use this vector store.
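Below is a minimal sketch of loading documents and querying them, assuming the AzureCosmosDBNoSqlVectorSearch class (shown with the langchain_community import path; a similarly named class may also be available from langchain-azure-ai, depending on version). The account endpoint, key, names, and policies are illustrative, and exact constructor arguments can vary between versions.

```python
from azure.cosmos import CosmosClient, PartitionKey
from langchain_community.vectorstores.azure_cosmos_db_no_sql import (
    AzureCosmosDBNoSqlVectorSearch,
)
from langchain_core.documents import Document
from langchain_azure_ai.embeddings import AzureAIEmbeddingsModel

# Placeholders: point these at your own Cosmos DB account and embeddings deployment.
cosmos_client = CosmosClient(
    "https://<your-account>.documents.azure.com:443/", "<cosmos-key>"
)
embeddings = AzureAIEmbeddingsModel(
    endpoint="https://<your-resource>.services.ai.azure.com/models",
    credential="<your-api-key>",
    model="text-embedding-3-small",
)

# Vector policy and index for a 1536-dimension embedding stored at /embedding.
vector_embedding_policy = {
    "vectorEmbeddings": [
        {
            "path": "/embedding",
            "dataType": "float32",
            "distanceFunction": "cosine",
            "dimensions": 1536,
        }
    ]
}
indexing_policy = {
    "indexingMode": "consistent",
    "includedPaths": [{"path": "/*"}],
    "excludedPaths": [{"path": '/"_etag"/?'}],
    "vectorIndexes": [{"path": "/embedding", "type": "quantizedFlat"}],
}

docs = [Document(page_content="Azure Cosmos DB NoSQL supports integrated vector search.")]

# Create (or reuse) the database and container, embed the documents, and store them.
vector_search = AzureCosmosDBNoSqlVectorSearch.from_documents(
    documents=docs,
    embedding=embeddings,
    cosmos_client=cosmos_client,
    database_name="langchain_db",
    container_name="langchain_container",
    vector_embedding_policy=vector_embedding_policy,
    indexing_policy=indexing_policy,
    cosmos_container_properties={"partition_key": PartitionKey(path="/id")},
    cosmos_database_properties={"id": "langchain_db"},
)

results = vector_search.similarity_search("integrated vector search", k=1)
print(results[0].page_content)
```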
Azure Cosmos DB Mongo vCore Vector Search
The Azure Cosmos DB for MongoDB vCore architecture makes it easy to create a database with full native MongoDB support. You can apply your MongoDB experience and continue to use your favorite MongoDB drivers, SDKs, and tools by pointing your application to the API for MongoDB (vCore) cluster’s connection string.

Install the pymongo package to use this vector store.
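A minimal sketch of the same flow against a Mongo vCore cluster, assuming the AzureCosmosDBVectorSearch class and its IVF index helpers from langchain_community; the connection string, database, collection, embedding model, and index settings shown here are placeholders.

```python
from pymongo import MongoClient
from langchain_community.vectorstores.azure_cosmos_db import (
    AzureCosmosDBVectorSearch,
    CosmosDBSimilarityType,
    CosmosDBVectorSearchType,
)
from langchain_core.documents import Document
from langchain_azure_ai.embeddings import AzureAIEmbeddingsModel

# Placeholder connection string for your Mongo vCore cluster.
CONNECTION_STRING = "mongodb+srv://<user>:<password>@<cluster>.mongocluster.cosmos.azure.com/"
client = MongoClient(CONNECTION_STRING)
collection = client["langchain_db"]["example_collection"]

embeddings = AzureAIEmbeddingsModel(
    endpoint="https://<your-resource>.services.ai.azure.com/models",
    credential="<your-api-key>",
    model="text-embedding-3-small",
)

docs = [Document(page_content="Azure Cosmos DB for MongoDB vCore supports vector search.")]

# Embed the documents and store them in the collection.
vectorstore = AzureCosmosDBVectorSearch.from_documents(
    docs,
    embeddings,
    collection=collection,
    index_name="vector-index",
)

# Create the vector index over the stored embeddings.
vectorstore.create_index(
    num_lists=1,                               # IVF cluster count; tune for your data size
    dimensions=1536,                           # must match the embedding model
    similarity=CosmosDBSimilarityType.COS,     # cosine similarity
    kind=CosmosDBVectorSearchType.VECTOR_IVF,  # IVF index type
)

results = vectorstore.similarity_search("vector search in Cosmos DB", k=1)
print(results[0].page_content)
```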