a week ago
Hi Community!
I have a question - could somebody please guide me on how to deploy my custom MCP server to Databricks?
What I would like to achieve is the following:
How can I authenticate to this MCP server?
How can I decide whether a given user has permission to use this MCP server? Is it somehow connected with access to the given Unity Catalog?
Should I create my own custom MCP server, or can I use a Managed MCP server already?
Today I tried to connect to a Managed MCP server for one of my catalogs, but the list of tools there was empty.
Thanks a lot in advance!
Monday
Hi @maikel ,
Custom hosted MCP servers are implemented as Databricks Apps - the documentation here should help you get started: https://docs.databricks.com/aws/en/generative-ai/mcp/custom-mcp
To answer some of your questions more specifically:
Tables: If you just want to access tables, you can use the managed DBSQL MCP server to run SQL across Unity Catalog tables, or use the managed Genie MCP server for natural-language queries over structured data. You can also expose deterministic tools as Unity Catalog Functions and run them through the managed UC Functions MCP server.
Volumes: Databricks does not currently expose Unity Catalog Volumes as first‑class managed MCP tools, but it may be something in the future. Today, you can either model the needed actions as Unity Catalog Functions (e.g., file read/parse helpers) or host a custom MCP server on Databricks Apps that implements volume operations under your own API contract.
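If you go the custom route, a minimal sketch of what such an App could look like is below. Treat it as an illustration, not a full implementation: the server name, tool, and transport wiring are assumptions on my side, using FastMCP from the MCP Python SDK plus the Databricks SDK Files API for volume access.

```python
# Minimal sketch of a custom MCP server exposing a volume-read tool.
# Assumes the `mcp` and `databricks-sdk` packages; names and paths are placeholders.
from databricks.sdk import WorkspaceClient
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("uc-volume-tools")
w = WorkspaceClient()  # picks up credentials from the App environment

@mcp.tool()
def read_volume_file(path: str) -> str:
    """Read a UTF-8 text file from a UC volume, e.g. /Volumes/<catalog>/<schema>/<volume>/file.txt."""
    resp = w.files.download(path)
    return resp.contents.read().decode("utf-8")

if __name__ == "__main__":
    # Databricks Apps front an HTTP server; streamable HTTP is the matching MCP transport.
    mcp.run(transport="streamable-http")
```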
Should you build a custom MCP server or use Managed MCP?
My recommendation would be to start with managed because it should be easier to get the ball rolling. If you find that you can't do everything with managed, you can switch to custom.
yesterday
Hi @stbjelcevic ,
Thanks a lot for the response! I already did some research and have more questions... I will be very thankful for further help:
GENIE MCP SERVER
Looks like this is the fastest solution. A created Genie space is visible in the workspace where it was created, and as its author I can define which data it has access to. So I get a URL which I can use in my agent:
https://{{host}}/api/2.0/mcp/genie/01f0caa969861e7fa4908e1555ea1492
With PAT authorization it is clear to me how Databricks knows whether or not I have access to the given UC and can work with the Genie MCP; listing the tools on that endpoint with a PAT looks roughly like the sketch below. However: is it also possible to handle this authentication programmatically, e.g. with OAuth and a service principal instead of a PAT?
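(Sketch only - the host and token are placeholders, and I am assuming the `mcp` Python SDK client here.)

```python
# Rough sketch: list the tools on the managed Genie MCP endpoint using a PAT.
# Assumes the `mcp` Python SDK; the host and token are placeholders.
import asyncio
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

URL = "https://<host>/api/2.0/mcp/genie/01f0caa969861e7fa4908e1555ea1492"
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

async def main() -> None:
    async with streamablehttp_client(URL, headers=HEADERS) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

asyncio.run(main())
```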
Thank you a lot in advance!
yesterday
Hi @maikel ,
Yes - look at the instructions here for programmatic ways to perform OAuth service principal auth: https://docs.databricks.com/aws/en/dev-tools/auth/oauth-m2m?language=Python#account-level-operations...
You will still need to initially create the service principal, client ID, and client secret in the UI, though. Once you have done that, the rest can be done with code.
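In code, with the Databricks Python SDK, that can look roughly like this (the client ID and secret are placeholder values you create in the UI first):

```python
# Sketch: OAuth M2M (service principal) auth with the Databricks Python SDK.
# The client ID and secret are placeholders created beforehand in the UI.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://<workspace-host>",
    client_id="<service-principal-client-id>",
    client_secret="<service-principal-oauth-secret>",
)

# authenticate() returns the current OAuth Authorization header, which can be
# passed to an MCP client in place of a PAT Bearer token.
headers = w.config.authenticate()
print(list(headers))  # e.g. ['Authorization']
```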
yesterday
Perfect! Thank you very much! 🙂
Monday
"Hi Maikel, for deploying a custom MCP on Databricks:
You can deploy your own MCP server if you need external agent access and custom tools.
Use Unity Catalog APIs to query tables/volumes.
Authentication and permissions can be tied to Databricks tokens and Unity Catalog access.
Managed MCP may show empty tools if your catalog or permissions don’t expose them—custom MCP gives full control.
Start by setting up your MCP server in a Databricks cluster and expose it via a secure endpoint."*
yesterday
A Databricks SQL warehouse also acts as an MCP server, so maybe you don't need a custom MCP server.
yesterday
@Hubert-Dudek
To communicate with an agent I need the MCP protocol, so a SQL warehouse alone is not enough, I assume. It has to be wrapped, e.g., in FastMCP (to define tools, resources, etc.) and FastAPI (if endpoints are required), and then deployed as a custom MCP server. This is what I have done currently: the MCP server is deployed in AWS and talks to Databricks via the REST API (a warehouse is also used to get table data); a simplified sketch is below. However, I would like to improve latency, and that is why I am asking about running the MCP server directly in Databricks.
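A simplified sketch of that wrapper (the warehouse ID is a placeholder and details are trimmed, so treat it as an illustration rather than my exact code):

```python
# Simplified sketch of the wrapper described above: a FastMCP tool that runs
# SQL on a Databricks SQL warehouse via the Statement Execution API.
# Assumes `mcp` and `databricks-sdk`; the warehouse ID is a placeholder.
from databricks.sdk import WorkspaceClient
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("warehouse-tools")
w = WorkspaceClient()  # credentials from the environment

@mcp.tool()
def run_query(sql: str) -> list[list[str]]:
    """Run a SQL statement on the warehouse and return the result rows."""
    resp = w.statement_execution.execute_statement(
        statement=sql,
        warehouse_id="<warehouse-id>",
        wait_timeout="30s",
    )
    if resp.result and resp.result.data_array:
        return resp.result.data_array
    return []

if __name__ == "__main__":
    mcp.run(transport="streamable-http")
```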
Nevertheless, thank you all! I believe I have a very good overview now with all the responses.
yesterday
@maikel I know that you need a more complicated approach, but the SQL warehouse actually does speak the MCP protocol - here is how I integrated it: https://databrickster.medium.com/how-to-connect-chatgpt-to-your-databricks-sql-warehouse-2ad0b8e562b... I also think you may like Agent Bricks - Multi-Agent Supervisor (it is only available in a few regions). Check the other Agent Bricks and the Playground too; I used them to develop use cases similar to yours, just in a different way.
yesterday
@Hubert-Dudek! Okay, I did not know about it! Unfortunately, in my Databricks workspace I can see Agent Bricks as Coming Soon 😞