Hi @Rahul-vlk
You can use the models both as Databricks-hosted "first-party" endpoints and via "external model" endpoints that proxy to the provider's hosted APIs, all through a unified, OpenAI-compatible interface. Databricks exposes first-party endpoints for top models (e.g., the Claude, Gemini, and GPT families) with a unified OpenAI-compatible API and SDK for chat, embeddings, vision, and reasoning tasks, available across supported regions and clouds.
You can centrally govern and query models hosted by OpenAI, Anthropic, and Google (Vertex/Gemini) by creating an external model endpoint; Databricks forwards your requests (including streaming and tools) to the provider and simplifies auth and governance.
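To make this concrete, here is a minimal sketch of calling such an endpoint over the OpenAI-compatible REST surface. The endpoint name `my-openai-chat` is hypothetical, and `DATABRICKS_HOST` / `DATABRICKS_TOKEN` are assumed to point at your workspace; the request is only sent when a token is actually configured:

```python
import json
import os
import urllib.request

# Assumptions: "my-openai-chat" is a hypothetical external model endpoint name;
# DATABRICKS_HOST / DATABRICKS_TOKEN must be set for a real call.
HOST = os.environ.get("DATABRICKS_HOST", "https://<workspace>.cloud.databricks.com")
ENDPOINT = "my-openai-chat"

def build_chat_request(messages, stream=False):
    """Build an OpenAI-compatible chat payload; Databricks forwards it to the provider."""
    return {"messages": messages, "stream": stream}

payload = build_chat_request([{"role": "user", "content": "Hello!"}])

# Only send the request when real credentials are configured.
if os.environ.get("DATABRICKS_TOKEN"):
    req = urllib.request.Request(
        url=f"{HOST}/serving-endpoints/{ENDPOINT}/invocations",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The same payload shape works through the official OpenAI Python SDK by pointing its `base_url` at your workspace's serving endpoints.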
Databricks supports OpenAI-style function calling and structured outputs during model serving. This applies to Databricks-hosted Foundation Model APIs and to serving endpoints that route to external models.
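As a hedged sketch, a structured-output request body in the OpenAI `json_schema` style looks like the following; the schema name and fields are made up for illustration:

```python
# Illustrative structured-output request body in the OpenAI "json_schema" style.
# "city_extraction" and its fields are hypothetical names for this example.
structured_payload = {
    "messages": [
        {"role": "user", "content": "Extract the city from: 'I moved to Oslo.'"}
    ],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "city_extraction",
            "schema": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    },
}
```

A model that supports structured outputs would then return a message whose content conforms to that schema.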
When you use Anthropic via an external endpoint, Databricks forwards the tools parameter, including support for Anthropic's Computer Use (beta), where the provider supports it.
Databricks-hosted Claude endpoints can be used with the OpenAI client and can include tool definitions (functions) in requests, enabling agent workflows using Databricks' function calling semantics (an example blog shows tools with a Databricks Claude model).
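For example, here is a tool definition in the OpenAI function-calling format that you could pass in a request to a Databricks-served Claude endpoint; the weather tool itself is hypothetical:

```python
# Hypothetical weather tool in the OpenAI function-calling schema; Databricks
# forwards/translates this for served models that support tool use.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

tool_payload = {
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [weather_tool],
    "tool_choice": "auto",  # let the model decide whether to call the tool
}
```

If the model decides to call the tool, the response contains a `tool_calls` entry with the function name and JSON arguments, which your agent code executes before sending the result back as a `tool` role message.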
In Summary:
Q: When using foundation models on Databricks, do we only get access to the raw API endpoints/models?
A: No. Databricks exposes foundation models through an OpenAI-compatible API layer and Mosaic AI Model Serving, which provide higher-level features such as chat completions, tools/function calling, and streaming, rather than just bare "raw" endpoints.
Q: Does Databricks support provider-specific tooling like OpenAI tools / function calling or Anthropic tools?
A: Yes, for a subset of features that Databricks has standardized. Databricks supports OpenAI-style tools/function calling and streaming for Databricks-hosted models (via Foundation Model APIs) and for many external models (OpenAI, Anthropic, etc.) when used through its chat/completions interface.
Q: So do we get the full native feature set of each provider?
A: Not fully. You get the features Databricks has wired into its unified interface (chat, tools/function calling, streaming, safety controls, etc.), but not every niche or newly released provider-specific endpoint or capability.
Some of the reference docs you requested:
1) https://docs.databricks.com/aws/en/machine-learning/foundation-model-apis
2) https://docs.databricks.com/aws/en/machine-learning/foundation-model-apis/supported-models
3) https://docs.databricks.com/aws/en/machine-learning/
4) https://docs.databricks.com/aws/en/machine-learning/model-serving/function-calling