Friendli enhances AI application performance and optimizes cost savings with scalable, efficient deployment options, tailored for high-demand AI workloads.
This tutorial guides you through integrating ChatFriendli for chat applications using LangChain. ChatFriendli offers a flexible approach to generating conversational AI responses, supporting both synchronous and asynchronous calls.

Setup

Ensure that the langchain_community and friendli-client packages are installed.
pip install -U langchain-community friendli-client
Sign in to Friendli Suite to create a Personal Access Token, and set it as the FRIENDLI_TOKEN environment variable.
import getpass
import os

if "FRIENDLI_TOKEN" not in os.environ:
    os.environ["FRIENDLI_TOKEN"] = getpass.getpass("Friendi Personal Access Token: ")
You can initialize a Friendli chat model by selecting the model you want to use. The default model is mixtral-8x7b-instruct-v0-1. You can check the available models at docs.friendli.ai.
from langchain_community.chat_models.friendli import ChatFriendli

chat = ChatFriendli(model="meta-llama-3.1-8b-instruct", max_tokens=100, temperature=0)
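If you prefer not to rely on the environment variable, the langchain_community integration also accepts the token at construction time. A minimal sketch, assuming the friendli_token parameter is available in your installed version:

from langchain_community.chat_models.friendli import ChatFriendli

# Pass the Personal Access Token explicitly instead of via FRIENDLI_TOKEN.
# friendli_token is assumed here; verify it against your installed langchain_community version.
chat = ChatFriendli(
    model="meta-llama-3.1-8b-instruct",
    friendli_token="flp_...",  # hypothetical placeholder token
    max_tokens=100,
    temperature=0,
)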

Usage

ChatFriendli supports all methods of ChatModel, including the async APIs. You can also use the invoke, batch, generate, and stream functionality.
from langchain_core.messages import HumanMessage, SystemMessage

system_message = SystemMessage(content="Answer questions as short as you can.")
human_message = HumanMessage(content="Tell me a joke.")
messages = [system_message, human_message]

chat.invoke(messages)
AIMessage(content="Why don't eggs tell jokes? They'd crack each other up.", additional_kwargs={}, response_metadata={}, id='run-d47c1056-54e8-4ea9-ad63-07cf74b834b7-0')
chat.batch([messages, messages])
[AIMessage(content="Why don't eggs tell jokes? They'd crack each other up.", additional_kwargs={}, response_metadata={}, id='run-36775b84-2a7a-48f0-8c68-df23ffffe4b2-0'),
 AIMessage(content="Why don't eggs tell jokes? They'd crack each other up.", additional_kwargs={}, response_metadata={}, id='run-b204be41-bc06-4d3a-9f74-e66ab1e60e4f-0')]
chat.generate([messages, messages])
LLMResult(generations=[[ChatGeneration(text="Why don't eggs tell jokes? They'd crack each other up.", message=AIMessage(content="Why don't eggs tell jokes? They'd crack each other up.", additional_kwargs={}, response_metadata={}, id='run-2e4cb949-8c51-40d5-92a0-cd0ac577db83-0'))], [ChatGeneration(text="Why don't eggs tell jokes? They'd crack each other up.", message=AIMessage(content="Why don't eggs tell jokes? They'd crack each other up.", additional_kwargs={}, response_metadata={}, id='run-afcdd1be-463c-4e50-9731-7a9f5958e396-0'))]], llm_output={}, run=[RunInfo(run_id=UUID('2e4cb949-8c51-40d5-92a0-cd0ac577db83')), RunInfo(run_id=UUID('afcdd1be-463c-4e50-9731-7a9f5958e396'))], type='LLMResult')
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)
Why don't eggs tell jokes? They'd crack each other up.
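Because ChatFriendli is a standard LangChain chat model, it also composes with LCEL runnables. A minimal sketch (the prompt text here is illustrative, not taken from the Friendli docs):

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

# Build a small LCEL pipeline: prompt -> ChatFriendli -> plain string output.
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "Answer questions as short as you can."),
        ("human", "{question}"),
    ]
)
chain = prompt | chat | StrOutputParser()
chain.invoke({"question": "Tell me a joke."})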
All the functionality of the async APIs is also available: ainvoke, abatch, agenerate, and astream.
await chat.ainvoke(messages)
AIMessage(content="Why don't eggs tell jokes? They'd crack each other up.", additional_kwargs={}, response_metadata={}, id='run-ba8062fb-68af-47b8-bd7b-d1e01b914744-0')
await chat.abatch([messages, messages])
[AIMessage(content="Why don't eggs tell jokes? They'd crack each other up.", additional_kwargs={}, response_metadata={}, id='run-5d2c77ab-2637-45da-8bbe-1b1f18a22369-0'),
 AIMessage(content="Why don't eggs tell jokes? They'd crack each other up.", additional_kwargs={}, response_metadata={}, id='run-f1338470-8b52-4d6e-9428-a694a08ae484-0')]
await chat.agenerate([messages, messages])
LLMResult(generations=[[ChatGeneration(text="Why don't eggs tell jokes? They'd crack each other up.", message=AIMessage(content="Why don't eggs tell jokes? They'd crack each other up.", additional_kwargs={}, response_metadata={}, id='run-d4e44569-39cc-40cc-93fc-de53e599fd51-0'))], [ChatGeneration(text="Why don't eggs tell jokes? They'd crack each other up.", message=AIMessage(content="Why don't eggs tell jokes? They'd crack each other up.", additional_kwargs={}, response_metadata={}, id='run-54647cc2-bee3-4154-ad00-2e547993e6d7-0'))]], llm_output={}, run=[RunInfo(run_id=UUID('d4e44569-39cc-40cc-93fc-de53e599fd51')), RunInfo(run_id=UUID('54647cc2-bee3-4154-ad00-2e547993e6d7'))], type='LLMResult')
async for chunk in chat.astream(messages):
    print(chunk.content, end="", flush=True)
Why don't eggs tell jokes? They'd crack each other up.
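The await expressions above assume an already-running event loop, as in a Jupyter notebook. In a standalone script, you can drive the same async methods with asyncio; a minimal sketch:

import asyncio

async def main():
    # Call the async variants from a plain Python script.
    response = await chat.ainvoke(messages)
    print(response.content)
    async for chunk in chat.astream(messages):
        print(chunk.content, end="", flush=True)

asyncio.run(main())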
