
model-specific tools (code interpreters, search, etc.) for OpenAI / Anthropic / Google

Rahul-vlk
Visitor

Hi,

I’m happy to see that the latest models from the three major AI labs (OpenAI, Anthropic, and Google) are now available on Databricks. I had a follow-up question about what exactly is supported.

Is Databricks only exposing the base models, or are some of the model-specific tools and features from these providers also supported?

For example, many of these labs provide additional functionality on their own platforms, such as:

  • Code interpreters / code execution

  • Specialized search / RAG tooling

  • Other model-adjacent utilities and features

Anthropic, for instance, has an "inventory" page that shows which of their features/tools are available on each platform (AWS, GCP, Azure, etc.). Based on that table, it looks like none of those extra tools are currently listed as available on Databricks: https://platform.claude.com/docs/en/build-with-claude/overview

So my questions are:

  1. When using these foundation models on Databricks, do we only get access to the raw API endpoints/models?

  2. Or does Databricks support any of the provider-specific tooling (e.g., Anthropic’s extra features, OpenAI tools, etc.) and, if so, which ones?

Thanks in advance for any clarification or pointers to documentation!

1 REPLY

iyashk-DB
Databricks Employee

Hi @Rahul-vlk 

You can use these models both as Databricks-hosted "first-party" endpoints and via "external model" endpoints that proxy to the providers' hosted APIs, all through a unified, OpenAI-compatible interface. Databricks exposes first-party endpoints for the top models (e.g., the Claude, Gemini, and GPT families) with a single OpenAI-compatible API and SDK covering chat, embeddings, vision, and reasoning tasks, available across supported regions and clouds.
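
To make that concrete, here is a minimal sketch of querying a Databricks-hosted endpoint with the standard OpenAI Python client. The workspace URL and the endpoint name are placeholders; use the endpoint names listed on your workspace's Serving page.

```python
# Minimal sketch: querying a Databricks-hosted foundation model through the
# OpenAI-compatible interface. The workspace URL and endpoint name are
# placeholders -- check the Serving page in your workspace for real names.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DATABRICKS_TOKEN"],  # Databricks personal access token
    base_url="https://<your-workspace>.cloud.databricks.com/serving-endpoints",
)

response = client.chat.completions.create(
    model="databricks-claude-sonnet-4",  # illustrative endpoint name
    messages=[{"role": "user", "content": "Summarize Unity Catalog in one sentence."}],
    max_tokens=256,
)
print(response.choices[0].message.content)
```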

You can centrally govern and query models hosted by OpenAI, Anthropic, and Google (Vertex/Gemini) by creating an external model endpoint; Databricks forwards your requests (including streaming and tools) to the provider and simplifies auth and governance.
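
As a rough sketch (not a definitive recipe), an external model endpoint that proxies to Anthropic can be created with the MLflow Deployments client. The endpoint name, model name, and secret scope/key below are placeholders; check the exact config fields against the external models docs linked at the end.

```python
# Minimal sketch: creating an external model endpoint that proxies to
# Anthropic's hosted API. Endpoint name, model name, and secret scope/key
# are placeholders; the API key is referenced as a Databricks secret.
from mlflow.deployments import get_deploy_client

deploy_client = get_deploy_client("databricks")

deploy_client.create_endpoint(
    name="anthropic-claude-chat",  # illustrative endpoint name
    config={
        "served_entities": [
            {
                "name": "claude",
                "external_model": {
                    "name": "claude-3-7-sonnet-latest",  # illustrative model name
                    "provider": "anthropic",
                    "task": "llm/v1/chat",
                    "anthropic_config": {
                        "anthropic_api_key": "{{secrets/my_scope/anthropic_key}}"
                    },
                },
            }
        ]
    },
)
```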

Databricks supports OpenAI-style function calling and structured outputs during model serving. This applies both to Databricks-hosted Foundation Model APIs and to serving endpoints that route to external models.
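
For example, structured outputs can be requested through the same OpenAI-compatible interface by passing a JSON-schema response_format. This is only a sketch; the endpoint name and schema are illustrative.

```python
# Minimal sketch: structured outputs via the OpenAI-compatible response_format.
# Endpoint name and schema are illustrative; client setup is as in the first
# snippet above.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DATABRICKS_TOKEN"],
    base_url="https://<your-workspace>.cloud.databricks.com/serving-endpoints",
)

response = client.chat.completions.create(
    model="databricks-claude-sonnet-4",  # illustrative endpoint name
    messages=[{"role": "user", "content": "Extract the city and country from: 'I live in Paris, France.'"}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "location",
            "schema": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                    "country": {"type": "string"},
                },
                "required": ["city", "country"],
            },
            "strict": True,
        },
    },
)
print(response.choices[0].message.content)  # JSON string matching the schema
```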

When you use Anthropic via an external model endpoint, Databricks forwards the tools parameters to the provider, including support for Anthropic's Computer Use (beta) where available.

Databricks-hosted Claude endpoints can be called with the OpenAI client, and you can include tool definitions (functions) in the request, enabling agent workflows via Databricks' function-calling semantics (an example blog demonstrates tools with a Databricks Claude model).
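
A minimal sketch of that pattern, with a hypothetical get_weather tool and placeholder workspace/endpoint names:

```python
# Minimal sketch: passing OpenAI-style tool definitions to a Databricks-hosted
# Claude endpoint and reading back the tool call. The get_weather tool,
# endpoint name, and workspace URL are all hypothetical placeholders.
import json
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DATABRICKS_TOKEN"],
    base_url="https://<your-workspace>.cloud.databricks.com/serving-endpoints",
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="databricks-claude-sonnet-4",  # illustrative endpoint name
    messages=[{"role": "user", "content": "What's the weather in Amsterdam?"}],
    tools=tools,
    tool_choice="auto",
)

message = response.choices[0].message
if message.tool_calls:  # the model may or may not decide to call the tool
    call = message.tool_calls[0]
    print(call.function.name, json.loads(call.function.arguments))
else:
    print(message.content)
```

From there the agent loop is the same as with the OpenAI API: your code executes the tool and sends the result back as a tool-role message.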

In Summary:

Q: When using foundation models on Databricks, do we only get access to the raw API endpoints/models?

A: No. Databricks exposes foundation models through an OpenAI-compatible API layer and Mosaic AI Model Serving, which provide higher-level features such as chat completions, tools/function calling, and streaming, rather than just bare "raw" endpoints.

Q: Does Databricks support provider‑specific tooling like OpenAI tools / function calling or Anthropic tools?

A: Yes, for a subset of features that Databricks has standardized. Databricks supports OpenAI‑style tools/function calling and streaming for Databricks‑hosted models (via Foundation Model APIs) and for many external models (OpenAI, Anthropic, etc.) when used through its chat/completions interface.

Q: So do we get the full native feature set of each provider?

A: Not fully. You get the features Databricks has wired into its unified interface (chat, tools/function calling, streaming, safety controls, etc.), but not every niche or newly released provider‑specific endpoint or capability.

Some reference docs, as requested:

1) https://docs.databricks.com/aws/en/machine-learning/foundation-model-apis

2) https://docs.databricks.com/aws/en/machine-learning/foundation-model-apis/supported-models

3) https://docs.databricks.com/aws/en/machine-learning/

4) https://docs.databricks.com/aws/en/machine-learning/model-serving/function-calling
