Friendli enhances AI application performance and optimizes cost savings with scalable, efficient deployment options tailored for high-demand AI workloads. This tutorial guides you through integrating ChatFriendli for chat applications using LangChain. ChatFriendli offers a flexible approach to generating conversational AI responses, supporting both synchronous and asynchronous calls.
## Setup
Ensure the `@langchain/community` package is installed.
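A typical installation with npm might look like the following (including `@langchain/core`, which most `@langchain/community` integrations expect as a peer dependency, is an assumption here):

```shell
npm install @langchain/community @langchain/core
```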
Set your Friendli personal access token as the `FRIENDLI_TOKEN` environment variable. You can optionally set your team ID as the `FRIENDLI_TEAM` environment variable.
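For example, in a shell session (the values below are placeholders, not real credentials):

```shell
# Substitute your own credentials from Friendli Suite.
export FRIENDLI_TOKEN="<your-personal-access-token>"
export FRIENDLI_TEAM="<your-team-id>"   # optional
```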
You can initialize a Friendli chat model by selecting the model you want to use. The default model is `meta-llama-3-8b-instruct`. You can check the available models at docs.friendli.ai.
## Usage
## Related
- Chat model conceptual guide
- Chat model how-to guides