Hi @Rahul-vlk
You can use the models in two ways: as Databricks-hosted "first-party" endpoints, or via "external model" endpoints that proxy to the provider's own hosted APIs. Both are exposed through a unified, OpenAI-compatible API and SDK for chat, embeddings, vision, and reasoning tasks. Databricks hosts first-party endpoints for top models (e.g., the Claude, Gemini, and GPT families) across supported regions and clouds.
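As a minimal sketch of that OpenAI-compatible interface (using only the standard library; the host, token, and endpoint name below are placeholders you would replace with your own workspace values):

```python
# Sketch: querying a Databricks serving endpoint through its
# OpenAI-compatible chat route. Host, token, and endpoint name
# are assumptions/placeholders, not real values.
import json
import urllib.request

def build_chat_request(endpoint: str, prompt: str) -> dict:
    # The same OpenAI-style payload shape works for both
    # first-party and external model endpoints.
    return {
        "model": endpoint,
        "messages": [{"role": "user", "content": prompt}],
    }

def query_endpoint(host: str, token: str, payload: dict) -> dict:
    # OpenAI-compatible route exposed under /serving-endpoints
    req = urllib.request.Request(
        f"https://{host}/serving-endpoints/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request(
    "databricks-claude-3-7-sonnet",  # assumed endpoint name; check your workspace
    "Summarize Unity Catalog in one sentence.",
)
print(json.dumps(payload, indent=2))
# To send it from a workspace:
# query_endpoint("<workspace-host>", "<DATABRICKS_TOKEN>", payload)
```

The same payload also works with the official OpenAI SDK by pointing `base_url` at `https://<workspace-host>/serving-endpoints`.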
You can centrally govern and query models hosted by OpenAI, Anthropic, and Google (Vertex/Gemini) by creating an external model endpoint; Databricks forwards your requests (including streaming and tools) to the provider and simplifies auth and governance.
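A rough sketch of what such an external model endpoint definition looks like (the shape follows the Databricks external-models docs; the endpoint name, model, and secret scope/key are placeholders):

```python
# Sketch: config for an external model endpoint that proxies to OpenAI.
# Names and the secret scope/key are illustrative placeholders.
def external_openai_endpoint_config(secret_scope: str, secret_key: str) -> dict:
    return {
        "served_entities": [{
            "name": "gpt-4o",
            "external_model": {
                "name": "gpt-4o",
                "provider": "openai",
                "task": "llm/v1/chat",
                "openai_config": {
                    # Reference the provider API key stored as a Databricks secret
                    "openai_api_key": f"{{{{secrets/{secret_scope}/{secret_key}}}}}",
                },
            },
        }]
    }

def create_endpoint(name: str, config: dict) -> None:
    # Requires the MLflow Deployments client with the Databricks plugin.
    from mlflow.deployments import get_deploy_client
    get_deploy_client("databricks").create_endpoint(name=name, config=config)

config = external_openai_endpoint_config("my-scope", "openai-key")
print(config["served_entities"][0]["external_model"]["provider"])  # openai
# create_endpoint("openai-chat", config)  # run inside a workspace
```

Once created, the endpoint is queried exactly like a Databricks-hosted one, so client code does not change when you swap providers.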
Databricks supports OpenAI-style function calling and structured outputs during model serving. This applies both to Databricks-hosted Foundation Model APIs and to serving endpoints that route to external models.
When you use Anthropic via an external endpoint, Databricks forwards the `tools` parameters, including support for Anthropic's Computer Use (beta) where the provider supports it.
Databricks-hosted Claude endpoints can be called with the OpenAI client, and requests can include tool definitions (functions), enabling agent workflows built on Databricks' function-calling semantics (a Databricks blog post demonstrates tools with a Databricks-hosted Claude model).
In Summary:
Q: When using foundation models on Databricks, do we only get access to the raw API endpoints/models?
A: No. Databricks exposes foundation models through an OpenAI‑compatible API layer and Mosaic AI Model Serving, which provide higher‑level features such as chat completions, tools/function calling, and streaming, rather than just bare “raw” endpoints.
Q: Does Databricks support provider‑specific tooling like OpenAI tools / function calling or Anthropic tools?
A: Yes, for a subset of features that Databricks has standardized. Databricks supports OpenAI‑style tools/function calling and streaming for Databricks‑hosted models (via Foundation Model APIs) and for many external models (OpenAI, Anthropic, etc.) when used through its chat/completions interface.
Q: So do we get the full native feature set of each provider?
A: Not fully. You get the features Databricks has wired into its unified interface (chat, tools/function calling, streaming, safety controls, etc.), but not every niche or newly released provider‑specific endpoint or capability.
Some reference docs, as you requested:
1) https://docs.databricks.com/aws/en/machine-learning/foundation-model-apis
2) https://docs.databricks.com/aws/en/machine-learning/foundation-model-apis/supported-models
3) https://docs.databricks.com/aws/en/machine-learning/
4) https://docs.databricks.com/aws/en/machine-learning/model-serving/function-calling