Bedrock
You are currently on a page documenting the use of Amazon Bedrock models as text completion models. Many popular models available on Bedrock are chat completion models. You may be looking for this page instead.
Amazon Bedrock is a fully managed service that makes Foundation Models (FMs) from leading AI startups and Amazon available via an API. You can choose from a wide range of FMs to find the model that is best suited for your use case.
This will help you get started with Bedrock completion models (LLMs) using LangChain. For detailed documentation on Bedrock features and configuration options, please refer to the API reference.
Overview
Integration details
| Class | Package | Local | Serializable | PY support |
|---|---|---|---|---|
| Bedrock | @langchain/community | ❌ | ✅ | ✅ |
Setup
To access Bedrock models you’ll need to create an AWS account, set up your AWS credentials, and install the @langchain/community integration, along with a few peer dependencies.
Credentials
Head to aws.amazon.com to sign up for an AWS account and enable access to the Bedrock models you want to use, then create an access key pair for authentication. Once you’ve done this, set the environment variables:
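For example, assuming you authenticate via the standard AWS SDK environment variables and the default credential provider chain (adapt this to however you manage AWS credentials):

```bash
# Standard AWS SDK credential variables, read by the default credential
# provider chain (assumed setup; adjust to your own credential management).
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
# Optional, only needed when using temporary credentials:
# export AWS_SESSION_TOKEN="your-session-token"
export AWS_DEFAULT_REGION="us-east-1"
```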
Installation
The LangChain Bedrock integration lives in the @langchain/community package:
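For example, with npm (the peer dependency list below is a representative set assumed for illustration; the exact list may differ between releases, so check the package documentation if installation fails):

```bash
npm install @langchain/community @langchain/core

# AWS peer dependencies used for request signing and credential resolution
# (a representative set; consult the package docs for the current list):
npm install @aws-crypto/sha256-js @aws-sdk/credential-provider-node @smithy/protocol-http @smithy/signature-v4 @smithy/eventstream-codec @smithy/util-utf8
```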
If you are running in a web environment rather than Node.js (for example, an edge function), you can omit the @aws-sdk/credential-provider-node dependency and use the web entrypoint instead:
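A minimal sketch of the web entrypoint, assuming it is exposed at @langchain/community/llms/bedrock/web and that you pass credentials explicitly (the values shown are placeholders):

```typescript
// The web entrypoint skips the Node-only credential provider, so credentials
// must be supplied explicitly.
import { Bedrock } from "@langchain/community/llms/bedrock/web";

const model = new Bedrock({
  model: "anthropic.claude-v2",
  region: "us-east-1",
  credentials: {
    accessKeyId: "<your-access-key-id>", // placeholder
    secretAccessKey: "<your-secret-access-key>", // placeholder
  },
});
```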
Instantiation
Now we can instantiate our model object and generate completions:
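A minimal instantiation sketch; the model ID, region, and option values are placeholders rather than recommendations:

```typescript
import { Bedrock } from "@langchain/community/llms/bedrock";

const llm = new Bedrock({
  model: "anthropic.claude-v2", // any Bedrock text model ID, e.g. "amazon.titan-text-express-v1"
  region: "us-east-1",
  temperature: 0,
  maxTokens: 512,
  // Credentials are resolved through the default AWS provider chain unless
  // passed explicitly via the `credentials` option.
});
```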
Invocation
Note that some models require specific prompting techniques. For example, Anthropic’s Claude-v2 model will throw an error if the prompt does not start with Human:.
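For example, reusing the llm instance from above, an invocation with a Claude-style prompt might look like this:

```typescript
// Claude text-completion models expect the "Human: ... Assistant:" framing.
const prompt = "\n\nHuman: Tell me a short joke about bears.\n\nAssistant:";

const completion = await llm.invoke(prompt);
console.log(completion);
```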
API reference
For detailed documentation of all Bedrock features and configurations head to the API reference.