
Custom MCP deployment

maikel
New Contributor III

Hi Community!

I have a question - could somebody please guide me on how to deploy my custom MCP server to Databricks?
What I would like to achieve is the following:

  • I have a Unity Catalog in Databricks for which I would like to have an MCP server
  • if the data in Unity Catalog is a table, I would like tools to query the table; if there are volumes, I would like tools for those volumes
  • I would like to get a URL for this MCP server so that agents outside Databricks can communicate with it

How can I authenticate to this MCP server?
How can I decide whether a given user has permission to use this MCP server? Is it connected with access to the given Unity Catalog somehow?
Should I create my own custom MCP server, or can I use managed MCP already?

Today I tried to connect to the managed MCP server for one of my catalogs, but the list of tools there is empty.

Thanks a lot in advance!


9 REPLIES

stbjelcevic
Databricks Employee

Hi @maikel ,

Custom hosted MCP servers are implemented as Databricks Apps - the documentation here should help you get started: https://docs.databricks.com/aws/en/generative-ai/mcp/custom-mcp
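For orientation, here is a minimal sketch of what such an app can look like, assuming the fastmcp and databricks-sdk packages; the tool name query_table, the WAREHOUSE_ID and DATABRICKS_APP_PORT environment variables, and the transport arguments are illustrative assumptions, not taken from that page:

```python
# Sketch: a custom MCP server intended to run as a Databricks App.
import os

from databricks.sdk import WorkspaceClient
from fastmcp import FastMCP

mcp = FastMCP("uc-tools")
w = WorkspaceClient()  # inside a Databricks App, auth resolves from the environment

@mcp.tool()
def query_table(sql: str) -> str:
    """Run a SQL statement on a warehouse and return the rows as plain text."""
    resp = w.statement_execution.execute_statement(
        statement=sql,
        warehouse_id=os.environ["WAREHOUSE_ID"],  # hypothetical env var you configure
        wait_timeout="30s",
    )
    rows = (resp.result.data_array if resp.result else None) or []
    return "\n".join(",".join(str(c) for c in row) for row in rows)

if __name__ == "__main__":
    # Databricks Apps tells the app which port to bind; 8000 is a fallback assumption.
    port = int(os.environ.get("DATABRICKS_APP_PORT", "8000"))
    mcp.run(transport="http", host="0.0.0.0", port=port)
```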

To answer some of your questions more specifically: 

  • Tables: If you just want to access tables, you can use the managed DBSQL MCP server to run SQL across Unity Catalog tables, or use the managed Genie MCP server for natural-language queries over structured data. You can also expose deterministic tools as Unity Catalog functions and use the managed UC Functions MCP server to run them.

  • Volumes: Databricks does not currently expose Unity Catalog volumes as first-class managed MCP tools, though that may come in the future. Today, you can either model the needed actions as Unity Catalog functions (e.g., file read/parse helpers) or host a custom MCP server on Databricks Apps that implements volume operations under your own API contract.

  • Custom MCP auth: OAuth
  • Managed and external MCP auth: OAuth or a Personal Access Token (PAT)
  • Authorization is tied to Unity Catalog: managed MCP enforces UC privileges for the calling identity (on-behalf-of-user), so tool discovery and execution only include what the user can see and run. Grant USE CATALOG/USE SCHEMA/SELECT/EXECUTE as appropriate on the resources your agent needs (a grants sketch follows this list).
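For that last point, a minimal sketch of granting read access at the catalog level with the Python SDK, assuming the databricks-sdk package; the catalog name and principal are placeholders, and whether securable_type takes the enum or a plain string varies by SDK version:

```python
# Sketch: grant a service principal read access on a catalog (placeholder names).
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()

w.grants.update(
    securable_type=catalog.SecurableType.CATALOG,  # some SDK versions accept "catalog"
    full_name="my_catalog",                        # placeholder catalog name
    changes=[
        catalog.PermissionsChange(
            principal="agent-sp-application-id",   # placeholder service principal
            add=[
                catalog.Privilege.USE_CATALOG,
                catalog.Privilege.USE_SCHEMA,
                catalog.Privilege.SELECT,
            ],
        )
    ],
)
```

The equivalent SQL GRANT statements work just as well; the point is that the managed MCP server will only surface what these privileges allow for the calling identity.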

Should you build a custom MCP server or use managed MCP?
My recommendation is to start with managed, because it is easier to get the ball rolling. If you find that you can't do everything with managed, you can switch to custom.


maikel
New Contributor III

Hi @stbjelcevic ,

Thanks a lot for the response! I have already done some research and have more questions... I would be very thankful for further help:

GENIE MCP SERVER

This looks like the fastest solution. A created Genie space is visible in the workspace where it was created, and as its author I can define which data it has access to. So I get a URL that I can use in my agent:

https://{{host}}/api/2.0/mcp/genie/01f0caa969861e7fa4908e1555ea1492

With PAT authorization it is clear to me how Databricks knows whether or not I have access to the given UC and can work with the Genie MCP server; a minimal connection sketch follows my questions below. However:

  • Is there a possibility to get an authentication token without opening a browser? A fully code-based solution.
  • Can I get a token for the Genie MCP server programmatically with a client ID and client secret, or a service principal? If so, how do I generate the token from code?
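For reference, a minimal sketch of the PAT-based connection mentioned above, assuming the official mcp Python SDK; the host and token are placeholders:

```python
# Sketch: list the tools exposed by the Genie MCP endpoint using a PAT.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

URL = "https://<host>/api/2.0/mcp/genie/01f0caa969861e7fa4908e1555ea1492"
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder token

async def main() -> None:
    async with streamablehttp_client(URL, headers=HEADERS) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

asyncio.run(main())
```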

Thank you a lot in advance!


stbjelcevic
Databricks Employee (Accepted Solution)

Hi @maikel ,

Yes - look at the instructions here for programmatic ways to perform OAuth service principal auth: https://docs.databricks.com/aws/en/dev-tools/auth/oauth-m2m?language=Python#account-level-operations...

You will still need to create the service principal, client ID, and client secret in the UI initially, though. Once you have done that, the rest can be done in code.
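To make that concrete, the M2M flow is a standard OAuth client-credentials call; here is a sketch with requests, assuming the workspace-level /oidc/v1/token endpoint (the linked page also covers account-level operations, which use a different URL), with host, client ID, and secret as placeholders:

```python
# Sketch: fetch an OAuth M2M access token for a service principal.
import requests

WORKSPACE = "https://<workspace-host>"  # placeholder workspace URL
CLIENT_ID = "<sp-client-id>"            # the service principal's client ID
CLIENT_SECRET = "<sp-client-secret>"    # the OAuth secret generated for it

resp = requests.post(
    f"{WORKSPACE}/oidc/v1/token",
    auth=(CLIENT_ID, CLIENT_SECRET),    # HTTP Basic auth
    data={"grant_type": "client_credentials", "scope": "all-apis"},
    timeout=30,
)
resp.raise_for_status()
token = resp.json()["access_token"]     # send as "Authorization: Bearer <token>"
```

The databricks-sdk can also handle this for you: constructing WorkspaceClient(host=..., client_id=..., client_secret=...) performs the same flow under the hood.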

maikel
New Contributor III

Perfect! Thank you very much! 🙂 

CharlotteMarti2
New Contributor III

"Hi Maikel, for deploying a custom MCP on Databricks:

You can deploy your own MCP server if you need external agent access and custom tools.

Use Unity Catalog APIs to query tables/volumes.

Authentication and permissions can be tied to Databricks tokens and Unity Catalog access.

Managed MCP may show empty tools if your catalog or permissions don't expose them; a custom MCP server gives full control.

Start by setting up your MCP server in Databricks (e.g., as a Databricks App) and expose it via a secure endpoint.

Hubert-Dudek
Esteemed Contributor III

A Databricks SQL warehouse also acts as an MCP server, so maybe you don't need a custom MCP server.

maikel
New Contributor III

@Hubert-Dudek 
To communicate with an agent I need the MCP protocol, so a SQL warehouse alone is not enough, I assume. It has to be wrapped, e.g., in FastMCP (to define tools, resources, etc.) and FastAPI (if endpoints are required), and then deployed as a custom MCP server. This is what I have done currently: the MCP server is deployed in AWS and talks to Databricks via the REST API (a warehouse is also used to get table data). However, I would like to improve latency, and that is why I am asking about MCP directly in Databricks.

Nevertheless, thank you all! I believe I now have a very good overview thanks to all the responses.

Hubert-Dudek
Esteemed Contributor III

@maikel I know that you need a more complicated approach, but the SQL warehouse actually does speak the MCP protocol - here is how I integrated it: https://databrickster.medium.com/how-to-connect-chatgpt-to-your-databricks-sql-warehouse-2ad0b8e562b... I also think you might like Agent Bricks - Multi-Agent Supervisor (it is only available in a few regions for now). Check the other Agent Bricks and the Playground as well; I have used them to develop use cases similar to yours, just in a different way.

maikel
New Contributor III

@Hubert-Dudek ! Okay, I did not know about that! Unfortunately, in my Databricks workspace I see Agent Bricks as Coming Soon 😞