Friendli enhances AI application performance and optimizes cost savings with scalable, efficient deployment options tailored for high-demand AI workloads. This tutorial guides you through integrating Friendli with LangChain.
Setup
Ensure the langchain_community and friendli-client packages are installed.
Sign in to Friendli Suite to create a personal access token, and set it as the FRIENDLI_TOKEN environment variable.
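A minimal sketch of the setup steps above (the pip package name for langchain_community uses a hyphen; the token placeholder is yours to fill in):

```shell
# Install the LangChain community integrations and the Friendli client.
pip install -U langchain-community friendli-client

# Set your personal access token from Friendli Suite.
export FRIENDLI_TOKEN="<your-personal-access-token>"
```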
You can specify the model of your choice; the default is meta-llama-3.1-8b-instruct. You can check the available models at friendli.ai/docs.
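For example, a minimal sketch of instantiating the LLM, assuming the langchain_community import path below, that the packages above are installed, and that FRIENDLI_TOKEN is set in the environment (the max_tokens and temperature values are illustrative):

```python
# Sketch: instantiate a Friendli LLM through LangChain.
# Assumes FRIENDLI_TOKEN is set in the environment.
from langchain_community.llms.friendli import Friendli

llm = Friendli(
    model="meta-llama-3.1-8b-instruct",  # default model; see friendli.ai/docs for others
    max_tokens=100,
    temperature=0,
)
```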
Usage
Friendli supports all methods of LLM, including async APIs.
You can use the invoke, batch, generate, and stream methods, as well as their async counterparts: ainvoke, abatch, agenerate, and astream.
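The methods above can be sketched as follows; this assumes a working FRIENDLI_TOKEN, so outputs are not shown, and the prompts are illustrative:

```python
# Sketch: sync and async calls on a Friendli LLM via LangChain.
# Assumes FRIENDLI_TOKEN is set in the environment.
import asyncio

from langchain_community.llms.friendli import Friendli

llm = Friendli(model="meta-llama-3.1-8b-instruct")

# Synchronous calls.
text = llm.invoke("Tell me a joke.")          # one prompt -> one completion string
texts = llm.batch(["Hello.", "Goodbye."])     # list of prompts -> list of completions
for chunk in llm.stream("Count to three."):   # chunks printed as they arrive
    print(chunk, end="", flush=True)

# Asynchronous counterparts.
async def main() -> None:
    text = await llm.ainvoke("Tell me a joke.")
    async for chunk in llm.astream("Count to three."):
        print(chunk, end="", flush=True)

asyncio.run(main())
```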