Overview
Integration details
| Provider | Class | Package | Local | Serializable | Downloads | Version |
|---|---|---|---|---|---|---|
| ModelScope | ModelScopeChatEndpoint | langchain-modelscope-integration | ❌ | ❌ | | |
Setup
To access the ModelScope chat endpoint you'll need to create a ModelScope account, get an SDK token, and install the langchain-modelscope-integration package.
Credentials
Head to ModelScope to sign up and generate an SDK token. Once you've done this, set the MODELSCOPE_SDK_TOKEN environment variable:
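
A minimal sketch of setting the token, assuming you want to be prompted for it interactively rather than exporting it in your shell:

```python
import getpass
import os

# Prompt for the ModelScope SDK token only if it isn't already set in the environment
if not os.environ.get("MODELSCOPE_SDK_TOKEN"):
    os.environ["MODELSCOPE_SDK_TOKEN"] = getpass.getpass("Enter your ModelScope SDK token: ")
```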
Installation
The LangChain ModelScope integration lives in the langchain-modelscope-integration package:
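
For example, with pip:

```bash
pip install -qU langchain-modelscope-integration
```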
Instantiation
Now we can instantiate our model object and generate chat completions:
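
A minimal sketch, assuming the package exposes the class from a langchain_modelscope module; the model ID shown is only an illustrative placeholder, so substitute any model available through ModelScope's API-Inference service:

```python
from langchain_modelscope import ModelScopeChatEndpoint  # module name is an assumption

llm = ModelScopeChatEndpoint(
    model="Qwen/Qwen2.5-Coder-32B-Instruct",  # hypothetical example model ID
    temperature=0,
)
```

Invocation

With the model instantiated, a chat completion can be generated by passing a list of messages to invoke, following the standard LangChain chat-model interface:

```python
messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)
```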
API reference
For detailed documentation of all ModelScopeChatEndpoint features and configurations, head to the reference: modelscope.cn/docs/model-service/API-Inference/intro