Overview
Integration details
| Class | Package | Local | Serializable | JS support |
|---|---|---|---|---|
| ChatGroq | langchain-groq | ❌ | beta | ✅ |
Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
|---|---|---|---|---|---|---|---|---|---|
| ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ | ✅ |
Setup
To access Groq models you'll need to create a Groq account, get an API key, and install the `langchain-groq` integration package.
Credentials
Head to the Groq console to sign up for Groq and generate an API key. Once you've done this, set the `GROQ_API_KEY` environment variable:
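For example, in a Python session you can read the key interactively with the standard-library `getpass` module so it isn't echoed or stored in your shell history:

```python
import getpass
import os

# Only prompt if the key isn't already set in the environment
if "GROQ_API_KEY" not in os.environ:
    os.environ["GROQ_API_KEY"] = getpass.getpass("Enter your Groq API key: ")
```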
Installation
The LangChain Groq integration lives in the `langchain-groq` package:
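For example, with pip:

```bash
pip install -qU langchain-groq
```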
Instantiation
Now we can instantiate our model object and generate chat completions.
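A minimal sketch, assuming `llama-3.1-8b-instant` is the model you want to use (any model available on your Groq account works here):

```python
from langchain_groq import ChatGroq

llm = ChatGroq(
    model="llama-3.1-8b-instant",  # example model name; substitute any supported Groq model
    temperature=0,
    max_tokens=None,
    timeout=None,
    max_retries=2,
    # other params...
)
```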
Reasoning Format
If you choose to set a `reasoning_format`, you must ensure that the model you are using supports it. You can find a list of supported models in the Groq documentation.
Invocation
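For example, a simple translation prompt passed as (role, content) tuples, reusing the `llm` object created above:

```python
messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]

# Invoke the model with the message list and inspect the response text
ai_msg = llm.invoke(messages)
print(ai_msg.content)
```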
API reference
For detailed documentation of all ChatGroq features and configurations, head to the API reference.