Databricks Lakehouse Platform unifies data, analytics, and AI on one platform. This guide provides a quick overview for getting started with Databricks LLMs. For detailed documentation of all features and configurations, head to the API reference.
Overview
The Databricks LLM class wraps a completion endpoint hosted as either of these two endpoint types:
- Databricks Model Serving, recommended for production and development,
- Cluster driver proxy app, recommended for interactive development.
Limitations
The Databricks LLM class is a legacy implementation and has several feature-compatibility limitations:
- Only synchronous invocation is supported; streaming and async APIs are not.
- The batch API is not supported.
The ChatDatabricks class supports all ChatModel APIs, including streaming, async, and batch, and should be preferred where those features are needed.
Setup
To access Databricks models you'll need to create a Databricks account, set up credentials (only if you are outside a Databricks workspace), and install the required packages.
Credentials (only if you are outside Databricks)
If you are running your LangChain app inside Databricks, you can skip this step. Otherwise, you need to manually set the Databricks workspace hostname and personal access token in the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables, respectively. See the Authentication Documentation for how to get an access token. Alternatively, you can pass the hostname and access token directly when initializing the Databricks class.
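For example, when running outside a workspace, the two environment variables can be set before constructing the model. The URL and token below are placeholders; substitute your own workspace values.

```python
import os

# Only needed outside a Databricks workspace. Replace the placeholders
# with your workspace URL and a personal access token.
os.environ.setdefault("DATABRICKS_HOST", "https://your-workspace.cloud.databricks.com")
os.environ.setdefault("DATABRICKS_TOKEN", "<your-personal-access-token>")
```

Using setdefault keeps any credentials already present in the environment intact.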
Installation
The LangChain Databricks integration lives in the langchain-community package. Also, mlflow >= 2.9 is required to run the code in this notebook.
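Both packages can be installed with pip; the version pin below reflects the mlflow >= 2.9 requirement stated above.

```shell
pip install -qU langchain-community "mlflow>=2.9"
```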
Wrapping Model Serving Endpoint
Prerequisites
- An LLM was registered and deployed to a Databricks serving endpoint.
- You have “Can Query” permission to the endpoint.
The expected model signature of the endpoint is:
- inputs: [{"name": "prompt", "type": "string"}, {"name": "stop", "type": "list[string]"}]
- outputs: [{"type": "string"}]
Invocation
Transform Input and Output
Sometimes you may want to wrap a serving endpoint that has an incompatible model signature, or you may want to insert extra configurations. You can use the transform_input_fn and transform_output_fn arguments to define additional pre- and post-processing.
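A sketch of two such hooks: one appends an instruction to every prompt before it reaches the endpoint, the other strips whitespace from the raw completion. The functions and the endpoint name are hypothetical examples, not part of the original text.

```python
def transform_input(**request):
    # Pre-process: append an instruction to the outgoing prompt.
    request["prompt"] = f"{request['prompt']} Be concise."
    return request

def transform_output(response):
    # Post-process: strip surrounding whitespace from the raw completion.
    return response.strip()

# Hypothetical usage with a placeholder endpoint name:
# llm = Databricks(
#     endpoint_name="my-llm-endpoint",
#     transform_input_fn=transform_input,
#     transform_output_fn=transform_output,
# )
```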