Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

158576
by New Contributor
  • 136 Views
  • 1 reply
  • 0 kudos

Mount CIFS volume on all-purpose compute results in permission denied

I have all the networking already set: nslookup resolves the NAS server IP and connectivity is open from the worker nodes to the NAS server. I am able to mount the same NAS drive outside of Databricks, i.e. on a standalone Linux VM in the same VPC where the worker nodes...

Latest Reply
siva-anantha
Contributor
  • 0 kudos

Hello, could you provide more information about why you want to attach a NAS drive to a Databricks cluster, please? I am no expert in storage. As far as I understand, NAS will suffer from IO and replication bottlenecks when attached to distributed ...

zaicnupagadi
by New Contributor II
  • 281 Views
  • 2 replies
  • 4 kudos

Resolved! Reaching out to Azure Storage with IP from Private VNET pool

Hey all, is there a way for Databricks to reach out to Azure Storage using a private endpoint? We would like to omit enabling access by "all trusted services". All resources are in the same VNET, however when Databricks tries to reach out to Storage instead...

Latest Reply
zaicnupagadi
New Contributor II
  • 4 kudos

Sorry for the late reply - thank you for your help, Nayan!

1 More Reply
maikel
by New Contributor III
  • 584 Views
  • 4 replies
  • 1 kudos

Agent outside Databricks communicating with a Databricks MCP server

Hello Community! I have the following use case in my project: User -> AI agent -> MCP Server -> Databricks data from Unity Catalog. The AI agent is not created in Databricks; the MCP server is created in Databricks and should expose tools to get data fr...

Latest Reply
maikel
New Contributor III
  • 1 kudos

Hello @mark_ott, thank you very much for this! It gives me a lot of knowledge! I think that since the data is stored in Databricks, we will go with the MCP deployed there as well. I have the next portion of questions - can you please tell me how to deploy my mcp se...

3 More Replies
Dharma25
by New Contributor III
  • 296 Views
  • 2 replies
  • 1 kudos

Task Hanging issue on DBR 15.4

Hello, I am running a structured streaming pipeline with 5 models loaded using pyfunc.spark_udf. Lately we have been noticing a very strange issue of tasks hanging, and batches taking a very long time to finish execution. CPU utilization is around...

Latest Reply
bianca_unifeye
Contributor
  • 1 kudos

On DBR 15.4 the DeadlockDetector: TASK_HANGING message usually just means Spark has noticed some very long-running tasks and is checking for deadlocks. With multiple pyfunc.spark_udf models in a streaming query the tasks often appear “stuck” because ...

1 More Reply
ismaelhenzel
by Contributor III
  • 491 Views
  • 4 replies
  • 3 kudos

Resolved! Asset bundle vs terraform

I would like to understand the differences between Terraform and Asset Bundles, especially since in some cases, they can do the same thing. I’m not talking about provisioning storage, networking, or the Databricks workspace itself—I know that is Terr...

Latest Reply
Coffee77
Contributor III
  • 3 kudos

First, DAB uses Terraform in the background. Having said that, my recommendation is to use DAB for whatever components are already included, and other tools only for IaC not yet supported or non-Databricks-specific (private VNets, external storage, etc.) ...
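As a minimal illustration of the DAB side of this recommendation, a bundle declares jobs next to the code they run; the bundle name, host, and paths below are invented placeholders, so adapt them to your workspace:

```yaml
# databricks.yml -- minimal Databricks Asset Bundle sketch
bundle:
  name: etl_pipelines

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-1234567890123456.7.azuredatabricks.net  # placeholder

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: etl
          notebook_task:
            notebook_path: ../src/etl.py  # path relative to the bundle root
```

Deploying with `databricks bundle deploy -t dev` then provisions the job via Terraform under the hood, which is why DAB and hand-written Terraform can overlap.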

3 More Replies
GeraldBriyolan
by New Contributor II
  • 279 Views
  • 3 replies
  • 0 kudos

Databricks Federated Token Exchange Returns HTML Login Page Instead of Access Token(GCP →Databricks)

Hi everyone, I'm trying to implement federated authentication (token exchange) from Google Cloud → Databricks without using a client ID / client secret, only a Google-issued service account token. I have also created a federation policy in Databr...

Administration & Architecture
Federation Policy
GCP
Token exchange
Latest Reply
WiliamRosa
Contributor III
  • 0 kudos

You might want to check whether the issue is related to your federation policy configuration. Try reviewing the following documentation to confirm that your policy is correctly set up (issuer, audiences, and other expected claims): https://docs.databri...

2 More Replies
rpl
by Contributor
  • 4259 Views
  • 14 replies
  • 2 kudos

Resolved! Which API to use to list groups in which a given user is a member

Is there an API that can be used to list groups in which a given user is a member? Specifically, I’d be interested in account (not workspace) groups.It seems there used to be a workspace-level list-parents API referred to in the answers to this quest...

Latest Reply
noorbasha534
Valued Contributor II
  • 2 kudos

There will be 2 system tables soon - users and groups. That will probably make life easy. I have already requested that the Databricks RSA assigned to my customer enable these for our control plane
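Until those system tables land, one commonly used approach is the account-level SCIM API with a SCIM filter on group membership. This is a sketch under assumptions: the endpoint path follows the account SCIM layout, the `members.value` filter comes from the SCIM 2.0 spec, and the host, account ID, and user ID are placeholders; verify filter support against the Databricks docs.

```python
from urllib.parse import urlencode

ACCOUNT_HOST = "https://accounts.azuredatabricks.net"   # Azure account console
ACCOUNT_ID = "11111111-2222-3333-4444-555555555555"     # placeholder account id
USER_ID = "1234567890"                                  # placeholder SCIM user id

def groups_for_user_url(host: str, account_id: str, user_id: str) -> str:
    """Build a SCIM Groups URL filtered to groups listing the user as a member."""
    query = urlencode({"filter": f'members.value eq "{user_id}"'})
    return f"{host}/api/2.0/accounts/{account_id}/scim/v2/Groups?{query}"

url = groups_for_user_url(ACCOUNT_HOST, ACCOUNT_ID, USER_ID)
# Send with an account-admin bearer token and read the "Resources" array, e.g.:
# req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
# groups = json.load(urllib.request.urlopen(req))["Resources"]
```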

13 More Replies
noorbasha534
by Valued Contributor II
  • 313 Views
  • 3 replies
  • 0 kudos

Azure VM quota for Databricks jobs - demand prediction

Hey folks, a quick check - wanted to gather thoughts on how you manage demand for Azure VM quota so you don't run into quota limit issues. In our case, we have several data domains (finance, master data, supply chain...) executing their projects in Dat...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Yes, Azure Databricks compute policies let you define “quota-like” limits, but only within Databricks, not Azure subscription quotas themselves. You still rely on Azure’s own quota system for vCPU/VM core limits at the subscription level. What you c...
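To make the "quota-like" idea concrete, here is a hedged sketch of a cluster policy definition in the Databricks policy-definition JSON format; the node types and limits are illustrative assumptions to tune per domain, not recommended values:

```python
import json

# One policy per data domain caps how much VM quota that domain can consume.
finance_policy = {
    # Restrict VM SKUs so one domain cannot claim large instance types.
    "node_type_id": {
        "type": "allowlist",
        "values": ["Standard_D4ds_v5", "Standard_D8ds_v5"],
    },
    # Cap autoscaling so a single job cluster cannot exhaust the vCPU quota.
    "autoscale.max_workers": {"type": "range", "maxValue": 8},
    # Force auto-termination so idle clusters hand cores back to the quota.
    "autotermination_minutes": {"type": "fixed", "value": 30},
}

policy_json = json.dumps(finance_policy, indent=2)
```

The resulting JSON string is what you pass as the policy `definition` when creating the policy (via the UI, the Cluster Policies API, or the SDK), then grant each domain's group permission to use only its own policy.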

2 More Replies
hdelas
by New Contributor
  • 180 Views
  • 1 reply
  • 0 kudos

Deploying Jobs in Databricks

How can I use the Databricks Python SDK from azure devops to create or update a job and explicitly assign it to a cluster policy (by policy ID or name)? Could you show me an example where the job definition includes a task and a job cluster that refe...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

To use the Databricks Python SDK from Azure DevOps to create or update a job and assign it explicitly to a cluster policy, specify the cluster policy by its ID in the job cluster section of your job definition. This ensures the cluster spawned for ...
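A minimal sketch of such a job definition, in the Jobs API 2.1 payload shape; the job name, node type, notebook path, and policy ID are placeholders, and the key detail is `policy_id` inside `new_cluster`:

```python
import json

POLICY_ID = "ABC123DEF456"  # placeholder: look up real IDs via the Cluster Policies API

job_spec = {
    "name": "nightly-etl",
    "job_clusters": [
        {
            "job_cluster_key": "main",
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",
                "node_type_id": "Standard_D4ds_v5",
                "num_workers": 2,
                "policy_id": POLICY_ID,  # binds the spawned cluster to the policy
            },
        }
    ],
    "tasks": [
        {
            "task_key": "etl",
            "job_cluster_key": "main",  # task runs on the policy-bound job cluster
            "notebook_task": {"notebook_path": "/Repos/team/etl/main"},
        }
    ],
}

payload = json.dumps(job_spec)
```

From an Azure DevOps pipeline, the same structure can be POSTed to the Jobs `create`/`reset` endpoints, or translated into the SDK's typed `jobs.Task` / `jobs.JobCluster` objects for `WorkspaceClient().jobs.create(...)`.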

Victor2
by New Contributor
  • 248 Views
  • 1 reply
  • 0 kudos

Unable to create connection in Power platform

When I try to create the connection, I get the error message "Connection test failed. Please review your configuration and try again." Here is the response in the network trace: My connection credentials are correct, so I'm not sure what I am doing wr...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The error message "Connection test failed. Please review your configuration and try again." when connecting Databricks to Power Platform can occur due to several common issues, even if your credentials are correct. Key troubleshooting steps: Double-c...

GeraldBriyolan
by New Contributor II
  • 177 Views
  • 1 reply
  • 0 kudos

Need to create an Identity Federation between my Databricks workspace/account and my GCP account

I am trying to authenticate my Databricks account using the Federation for fetching the data. I have created a service account in GCP, and also using Google Auth, I have generated a token, but I don't know how to exchange the token to authenticate Da...

Administration & Architecture
Databricks
Federation Policy
GCP
Latest Reply
stbjelcevic
Databricks Employee
  • 0 kudos

Hi @GeraldBriyolan, you may need to use a Google ID Token to do what you are trying to do: https://docs.databricks.com/gcp/en/dev-tools/auth/authentication-google-id  
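For illustration, the exchange the federation policy enables is an RFC 8693 token exchange: trading the Google-issued ID token (a JWT) for a Databricks access token. The `/oidc/v1/token` path and field values below follow the Databricks workload identity federation docs as I understand them, and the workspace host and JWT are placeholders; verify against the linked documentation:

```python
from urllib.parse import urlencode
from urllib.request import Request

WORKSPACE = "https://1234567890123456.7.gcp.databricks.com"  # placeholder host
google_id_token = "eyJ...google-issued-jwt"                  # placeholder JWT

# RFC 8693 token-exchange form fields.
form = {
    "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
    "subject_token": google_id_token,
    "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
    "scope": "all-apis",
}
req = Request(
    f"{WORKSPACE}/oidc/v1/token",
    data=urlencode(form).encode(),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    method="POST",
)
# urllib.request.urlopen(req) should return JSON with an `access_token`
# usable as a Bearer token; getting an HTML login page back usually means
# the policy did not match the token's issuer/audience claims, or the
# request hit a UI endpoint rather than the token endpoint.
```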

old_school
by New Contributor II
  • 222 Views
  • 3 replies
  • 0 kudos

Cap on OIDC federation policies (max 20) when enabling workload identity federation for GitHub Actions

Hi Databricks community, I have followed the page below and created GitHub OIDC federation policies, but there seems to be a cap on how many a Service Principal can create (20 max). Is there any workaround for this, or some other solution apart from using Client ID an...

Latest Reply
stbjelcevic
Databricks Employee
  • 0 kudos

I can't speak for specifically why, but allowing wildcards creates security risks and most identity providers and standards guidance require exact, pre-registered URLs.

2 More Replies
Raman_Unifeye
by Contributor III
  • 354 Views
  • 5 replies
  • 3 kudos

Prevent Access to AI Functions Execution

As a workspace admin, I want to prevent unexpected API costs from unrestricted usage of AI Functions (AI_QUERY() etc.). How can we ensure that only a particular group of users can execute AI Functions? I understand the function execution cost can be vi...

Latest Reply
Raman_Unifeye
Contributor III
  • 3 kudos

OK, so it has to be done at the individual endpoint and function level.

4 More Replies
kashif_dev
by New Contributor
  • 291 Views
  • 1 reply
  • 0 kudos

Azure DB Workspace Not Connected to DB Account Unity Catalog & Admin Console Missing (identity=null)

Hi team, I created a brand-new Azure environment and an Azure Databricks workspace, but the workspace appears to be in classic (legacy) mode and is not connected to a Databricks account, so Unity Catalog cannot be enabled. Below are all the details and...

Latest Reply
Coffee77
Contributor III
  • 0 kudos

I think you need a "corporate" account with the Azure Global Administrator role to enable/access the Databricks account. For instance, in some of my demo workspaces I can't access UC with my "hotmail" account. I haven't looked deeper into it so far. So, a...
