Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Agent outside databricks communication with databricks MCP server

maikel
New Contributor II

Hello Community!

I have the following use case in my project:

User -> AI agent -> MCP Server -> Databricks data from unity catalog.

- The AI agent is not created in Databricks.
- The MCP server is created in Databricks and should expose tools to get data from a given Unity Catalog.

I can see in https://docs.databricks.com/aws/en/generative-ai/mcp/custom-mcp that it should be possible to host an MCP server in a Databricks app. However, my question is: would it be possible to connect it with an agent outside Databricks? Additionally, should I implement methods in my custom MCP server to work with Databricks data via REST, or can this be simplified given that the MCP server runs inside Databricks?

Thanks a lot for the support!
Michal

2 REPLIES

mark_ott
Databricks Employee

Yes, it is possible to connect an external AI agent to an MCP (Model-Serving Custom Platform) server hosted within Databricks, and the benefits and options for working with Databricks data depend on your architecture choices.

Connecting External AI Agents to Databricks MCP

  • You can host your custom MCP server within Databricks and expose its endpoints via the authentication and networking setup that Databricks provides (for example, allowing external access via secure endpoints or APIs).

  • As long as your MCP server is accessible over the network (using public or secure/private endpoints that you configure), an AI agent running outside of Databricks can connect to it by making REST API calls or using SDKs compatible with the MCP interface.

MCP Methods for Unity Catalog Data

  • Direct Databricks Access: Since your MCP server is hosted inside Databricks, it typically has faster and more privileged access to Databricks-managed resources, like Unity Catalog. You can leverage Databricks-native APIs or even direct Spark jobs within the MCP server's code to access and process Unity Catalog data efficiently.

  • REST vs. Native Access:

    • If you host MCP inside Databricks, your MCP methods can directly interact with Unity Catalog using Databricks SDKs (Python, Scala, SQL) or Spark APIs without the overhead of going through REST APIs.

    • If you plan to keep MCP outside Databricks, you would have to use Databricks REST APIs to interact with Unity Catalog, which adds additional network calls and potential complexity.

  • Recommended: Since MCP is inside Databricks, implement direct access methods using Databricks APIs rather than wrapping REST calls; this approach is more streamlined, efficient, and secure.
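As an illustration of the "direct access" option, a hypothetical MCP tool running inside Databricks could use the Databricks SDK for Python (pip install databricks-sdk) instead of hand-rolled REST calls. The catalog and schema names below are assumptions for illustration:

```python
# Hedged sketch of an MCP tool implementation running INSIDE Databricks,
# using the Databricks SDK for Unity Catalog access. Catalog/schema names
# are illustrative assumptions.


def full_table_name(catalog: str, schema: str, table: str) -> str:
    """Compose the three-level Unity Catalog name used in queries."""
    return f"{catalog}.{schema}.{table}"


def list_tables(catalog: str, schema: str) -> list[str]:
    """Return the full names of all tables in one Unity Catalog schema."""
    # Imported here so the pure helper above works without the SDK installed.
    from databricks.sdk import WorkspaceClient

    # Inside a Databricks app or notebook, WorkspaceClient() picks up
    # credentials from the environment; no browser flow is needed.
    w = WorkspaceClient()
    return [t.full_name for t in w.tables.list(catalog_name=catalog, schema_name=schema)]


if __name__ == "__main__":
    print(list_tables("main", "default"))
```

The point of the sketch is the authentication model: because the code runs inside Databricks, the SDK resolves credentials from the runtime environment rather than requiring an explicit token in the tool code.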

Summary Table

Component                  | Location              | Best Access Method            | Notes
AI Agent                   | Outside Databricks    | MCP REST API                  | Connects to MCP over network
MCP Server                 | Inside Databricks     | Direct Databricks/Spark APIs  | Native, fast, secure
MCP Server                 | Outside Databricks    | Databricks REST API           | Less direct, more overhead
Unity Catalog (data layer) | Managed by Databricks | -                             | -

Key Recommendations

  • Connect your external AI agent to the MCP server via REST without issues as long as networking and permissions are properly configured.

  • Since MCP is running in Databricks, use internal Databricks APIs for Unity Catalog access instead of building REST-based data access in your MCP server logic.

This approach will offer you more efficient, secure, and robust access to your data within Databricks while supporting external AI agent connectivity.
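For the "permissions properly configured" part, one common browser-less option is a service principal with OAuth machine-to-machine (client credentials). The workspace host and credentials below are placeholders; the sketch only builds the token request against the workspace's /oidc/v1/token endpoint:

```python
# Hedged sketch: obtaining a Databricks OAuth token for a service principal
# (machine-to-machine, no browser). Host, client id, and secret are
# placeholder assumptions.
import base64
import json
import urllib.parse
import urllib.request


def build_token_request(host: str, client_id: str, client_secret: str) -> urllib.request.Request:
    """Build the client_credentials request for the workspace token endpoint."""
    body = urllib.parse.urlencode(
        {"grant_type": "client_credentials", "scope": "all-apis"}
    ).encode()
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return urllib.request.Request(
        f"{host}/oidc/v1/token",
        data=body,
        headers={
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
    )


if __name__ == "__main__":
    req = build_token_request(
        "https://<workspace>.cloud.databricks.com", "<sp-client-id>", "<sp-client-secret>"
    )
    with urllib.request.urlopen(req) as resp:
        # The returned access_token goes into the Authorization: Bearer header
        # of subsequent MCP or REST calls.
        print(json.loads(resp.read())["access_token"])
```

The agent exchanges the service principal's credentials for a short-lived access token and then presents it as a bearer token on every call, so no interactive login is involved.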

maikel
New Contributor II

Hello Mark!

Thanks a lot for the response! By MCP I meant a Model Context Protocol server, not "Model-Serving Custom Platform".

I am thinking about those two approaches:

  • AI agent outside databricks -> MCP server outside databricks -> data in Unity Catalog

 

The reason for this approach is that I would like more control over the MCP deployment, resources, etc. In this scenario, as you mentioned, we should go with the REST API between the MCP server and the Unity Catalog data. Could you please advise what the best option is for authentication through code (without browser involvement)? Can I create a secure endpoint for this? If so, how?

  • AI agent outside databricks -> MCP server inside databricks -> data in Unity Catalog

Can you please share more details on how I can create a secure endpoint in Databricks so that my agent can authenticate to the MCP server?
If I have the MCP server inside Databricks, does that mean that in my Python code I can just use e.g. PySpark functions to access the data, with no authorization required?

Thank you!

Best regards,

Michal