This will help you get started with DeepSeek chat models. For detailed documentation of all ChatDeepSeek features and configurations, head to the API reference.
## Overview
### Integration details
| Class | Package | Local | Serializable | PY support |
|---|---|---|---|---|
| ChatDeepSeek | @langchain/deepseek | ❌ (see Ollama) | beta | ✅ |
### Model features
See the links in the table headers below for guides on how to use specific features.

| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Token usage | Logprobs |
|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ |
Note that tool calling and structured output are not currently supported for deepseek-reasoner.
## Setup
To access DeepSeek models you’ll need to create a DeepSeek account, get an API key, and install the `@langchain/deepseek` integration package.
You can also access the DeepSeek API through providers like Together AI or Ollama.
### Credentials
Head to https://deepseek.com/ to sign up to DeepSeek and generate an API key. Once you’ve done this, set the `DEEPSEEK_API_KEY` environment variable:
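For example, in a bash-style shell (adjust for your own shell or secrets manager, and replace the placeholder with your actual key):

```bash
# Make the DeepSeek API key available to your application for this shell session
export DEEPSEEK_API_KEY="your-api-key"
```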
### Installation
The LangChain ChatDeepSeek integration lives in the `@langchain/deepseek` package:
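For example, with npm (yarn or pnpm work as well); `@langchain/core` is included here because LangChain integration packages declare it as a peer dependency:

```bash
npm install @langchain/deepseek @langchain/core
```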
## Instantiation
Now we can instantiate our model object and generate chat completions:
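A minimal sketch, assuming the `DEEPSEEK_API_KEY` environment variable is set and the `deepseek-chat` model is used (the exact prompt and options here are illustrative):

```typescript
import { ChatDeepSeek } from "@langchain/deepseek";

// Reads DEEPSEEK_API_KEY from the environment by default
const llm = new ChatDeepSeek({
  model: "deepseek-chat",
  temperature: 0,
});

// Generate a chat completion from a simple system/human prompt
const aiMsg = await llm.invoke([
  ["system", "You are a helpful assistant that translates English to French."],
  ["human", "I love programming."],
]);

console.log(aiMsg.content);
```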
## API reference

For detailed documentation of all ChatDeepSeek features and configurations, head to the API reference: https://api.js.langchain.com/classes/_langchain_deepseek.ChatDeepSeek.html