Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Dharma25
by New Contributor III
  • 286 Views
  • 2 replies
  • 1 kudos

Task Hanging issue on DBR 15.4

Hello, I am running a structured streaming pipeline with 5 models loaded using pyfunc.spark_udf. Lately we have been noticing a very strange issue of tasks hanging, and batches taking a very long time to finish execution. CPU utilization is around...

Latest Reply
bianca_unifeye
Contributor
  • 1 kudos

On DBR 15.4 the DeadlockDetector: TASK_HANGING message usually just means Spark has noticed some very long-running tasks and is checking for deadlocks. With multiple pyfunc.spark_udf models in a streaming query the tasks often appear “stuck” because ...

1 More Replies
ismaelhenzel
by Contributor III
  • 465 Views
  • 4 replies
  • 3 kudos

Resolved! Asset bundle vs terraform

I would like to understand the differences between Terraform and Asset Bundles, especially since in some cases, they can do the same thing. I’m not talking about provisioning storage, networking, or the Databricks workspace itself—I know that is Terr...

Latest Reply
Coffee77
Contributor III
  • 3 kudos

First, DAB uses Terraform in the background. Having said that, my recommendation is to use DAB for whatever components it already covers, and other tools only for IaC that is not supported yet or is not Databricks-specific (private VNets, external storage, etc.) ...
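For context, a Databricks Asset Bundle is driven by a databricks.yml file at the project root. A minimal sketch of what one might look like; the bundle name, job, notebook path, and workspace host below are all placeholder assumptions, not a definitive layout:

```yaml
# databricks.yml - minimal bundle sketch (all names/hosts are placeholders)
bundle:
  name: my_etl_bundle

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: ingest
          notebook_task:
            notebook_path: ./notebooks/ingest.py

targets:
  dev:
    default: true
    workspace:
      host: https://adb-1234567890123456.7.azuredatabricks.net
```

Running `databricks bundle deploy -t dev` would then drive the Terraform layer DAB manages internally, which is why mixing DAB with hand-written Terraform for the same resources is usually avoided.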

3 More Replies
GeraldBriyolan
by New Contributor II
  • 271 Views
  • 3 replies
  • 0 kudos

Databricks Federated Token Exchange Returns HTML Login Page Instead of Access Token(GCP →Databricks)

Hi everyone, I’m trying to implement federated authentication (token exchange) from Google Cloud → Databricks without using a client ID / client secret, using only a Google-issued service account token. I have also created a federation policy in Databr...

Administration & Architecture
Federation Policy
GCP
Token exchange
Latest Reply
WiliamRosa
Contributor III
  • 0 kudos

You might want to check whether the issue is related to your federation policy configuration. Try reviewing the following documentation to confirm that your policy is correctly set up (issuer, audiences, and other expected claims): https://docs.databri...

2 More Replies
rpl
by Contributor
  • 4242 Views
  • 14 replies
  • 2 kudos

Resolved! Which API to use to list groups in which a given user is a member

Is there an API that can be used to list groups in which a given user is a member? Specifically, I’d be interested in account (not workspace) groups.It seems there used to be a workspace-level list-parents API referred to in the answers to this quest...

Latest Reply
noorbasha534
Valued Contributor II
  • 2 kudos

There will be 2 system tables soon - users and groups. That will probably make life easier. I have already asked the Databricks RAS assigned to my customer to enable these for our control plane.

13 More Replies
noorbasha534
by Valued Contributor II
  • 303 Views
  • 3 replies
  • 0 kudos

Azure VM quota for databricks jobs - demand prediction

Hey folks, a quick check - I wanted to gather thoughts on how you manage demand for Azure VM quota so you don't run into quota limit issues. In our case, we have several data domains (finance, master data, supply chain...) executing their projects in Dat...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Yes, Azure Databricks compute policies let you define “quota-like” limits, but only within Databricks, not Azure subscription quotas themselves. You still rely on Azure’s own quota system for vCPU/VM core limits at the subscription level. What you c...
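As an illustration of those “quota-like” limits, a cluster policy definition can constrain which instance types a team may launch and cap autoscaling. A hypothetical policy sketch; the node types and limit values below are example assumptions, not recommendations:

```json
{
  "node_type_id": {
    "type": "allowlist",
    "values": ["Standard_DS3_v2", "Standard_DS4_v2"]
  },
  "autoscale.max_workers": {
    "type": "range",
    "maxValue": 8,
    "defaultValue": 4
  },
  "autotermination_minutes": {
    "type": "fixed",
    "value": 30
  }
}
```

Assigning one such policy per data domain keeps each domain's worst-case vCPU draw predictable, which makes the subscription-level Azure quota request easier to size.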

2 More Replies
hdelas
by New Contributor
  • 177 Views
  • 1 reply
  • 0 kudos

Deploying Jobs in Databricks

How can I use the Databricks Python SDK from azure devops to create or update a job and explicitly assign it to a cluster policy (by policy ID or name)? Could you show me an example where the job definition includes a task and a job cluster that refe...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

To use the Databricks Python SDK from Azure DevOps to create or update a job and assign it explicitly to a cluster policy, specify the cluster policy by its ID in the job cluster section of your job definition. This ensures the cluster spawned for ...
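A minimal sketch of what such a job definition might look like as a Jobs API 2.1-style payload; the policy ID, node type, Spark version, and notebook path are placeholder assumptions, and with the Python SDK you would pass the equivalent dataclasses to WorkspaceClient().jobs.create(...):

```python
# Sketch of a job settings payload that pins the job cluster to a cluster
# policy by ID. All identifiers here are hypothetical placeholders.
POLICY_ID = "ABC123DEF456"  # hypothetical cluster policy ID

def build_job_settings(policy_id: str) -> dict:
    """Build job settings whose job cluster references a cluster policy."""
    return {
        "name": "nightly-etl",
        "job_clusters": [
            {
                "job_cluster_key": "main",
                "new_cluster": {
                    "spark_version": "15.4.x-scala2.12",
                    "node_type_id": "Standard_DS3_v2",
                    "num_workers": 2,
                    # The policy is attached here, by its ID:
                    "policy_id": policy_id,
                },
            }
        ],
        "tasks": [
            {
                "task_key": "etl",
                "job_cluster_key": "main",
                # The task runs on the policy-governed cluster above.
                "notebook_task": {"notebook_path": "/Repos/team/etl"},
            }
        ],
    }

settings = build_job_settings(POLICY_ID)
```

In an Azure DevOps pipeline the SDK would pick up authentication from environment variables (e.g. a service principal), so the same settings can be sent on every deploy to create or update the job.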

Victor2
by New Contributor
  • 239 Views
  • 1 reply
  • 0 kudos

Unable to create connection in Power platform

When I try to create the connection, I get the error message "Connection test failed. Please review your configuration and try again." Here is the response in the network trace. My connection credentials are correct, so I'm not sure what I am doing wr...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The error message "Connection test failed. Please review your configuration and try again." when connecting Databricks to Power Platform can occur due to several common issues, even if your credentials are correct. Key troubleshooting steps: Double-c...

GeraldBriyolan
by New Contributor II
  • 172 Views
  • 1 reply
  • 0 kudos

Need to create an Identity Federation between my Databricks workspace/account and my GCP account

I am trying to authenticate my Databricks account using federation for fetching data. I have created a service account in GCP and, using Google Auth, I have generated a token, but I don't know how to exchange the token to authenticate Da...

Administration & Architecture
Databricks
Federation Policy
GCP
Latest Reply
stbjelcevic
Databricks Employee
  • 0 kudos

Hi @GeraldBriyolan , You may need to use a Google ID Token to do what you are trying to do: https://docs.databricks.com/gcp/en/dev-tools/auth/authentication-google-id  
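For reference, Databricks token federation follows the RFC 8693 token-exchange shape: you POST a form-encoded body, with the Google ID token as the subject token, to the workspace's /oidc/v1/token endpoint. A hedged sketch that only builds the request body; the workspace host and the token value are placeholders, and the real ID token would come from Google's auth libraries:

```python
from urllib.parse import urlencode

# Hypothetical workspace host; the real endpoint is your workspace URL
# followed by /oidc/v1/token.
TOKEN_ENDPOINT = "https://<workspace-url>/oidc/v1/token"

def build_token_exchange_body(google_id_token: str) -> str:
    """Form-encoded RFC 8693 body exchanging a Google ID token
    for a Databricks access token."""
    params = {
        "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
        "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
        "subject_token": google_id_token,
        "scope": "all-apis",
    }
    return urlencode(params)

# Fake, truncated JWT purely for illustration.
body = build_token_exchange_body("eyJhbGciOi...")
```

Getting an HTML login page back instead of a JSON token response typically means the request reached a browser-facing URL rather than the token endpoint, or the federation policy did not match the token's issuer/audience claims.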

old_school
by New Contributor II
  • 213 Views
  • 3 replies
  • 0 kudos

Cap on OIDC (max 20) Enable workload identity federation for GitHub Actions

Hi Databricks community, I have followed the page below and created GitHub OIDC federation policies, but there seems to be a cap on how many OIDCs a Service Principal can create (20 max). Is there any workaround for this, or some other solution apart from using Client ID an...

Latest Reply
stbjelcevic
Databricks Employee
  • 0 kudos

I can't speak for specifically why, but allowing wildcards creates security risks and most identity providers and standards guidance require exact, pre-registered URLs.

2 More Replies
Raman_Unifeye
by Contributor III
  • 342 Views
  • 5 replies
  • 3 kudos

Prevent Access to AI Functions Execution

As a workspace admin, I want to prevent unexpected API costs from unrestricted usage of AI Functions (AI_QUERY() etc.). How can we ensure that only a particular group of users can execute AI Functions? I understand the function execution cost can be vi...

Latest Reply
Raman_Unifeye
Contributor III
  • 3 kudos

OK, so it has to be done at the individual endpoint and function level.

4 More Replies
kashif_dev
by New Contributor
  • 282 Views
  • 1 reply
  • 0 kudos

Azure DB Workspace Not Connected to DB Account Unity Catalog & Admin Console Missing (identity=null)

Hi team, I created a brand-new Azure environment and an Azure Databricks workspace, but the workspace appears to be in classic (legacy) mode and is not connected to a Databricks account, so Unity Catalog cannot be enabled. Below are all the details and...

Latest Reply
Coffee77
Contributor III
  • 0 kudos

I think you need a "corporate" account with the Azure Global Administrator role to enable/access the Databricks account. For instance, in some of my demo workspaces I can't access UC with my "hotmail" account. I haven't looked deeper into it so far. So, a...

lubiarzm1
by New Contributor III
  • 571 Views
  • 7 replies
  • 1 kudos

Resolved! Deployment of private databricks workspace.

I tried to create a configuration of Databricks with VNet injection and faced a few problems during deployment. 1. I tried to deploy my workspace using IaC and Terraform. The whole time I faced an issue with the NSG, even when I created the configuration as follows in this ...

Latest Reply
lubiarzm1
New Contributor III
  • 1 kudos

All issues were resolved. Ready-to-deploy code:

locals {
  default_tags = {
    terraform = "true"
    workload  = var.app
    env       = var.environment
  }
}

resource "azurerm_databricks_access_connector...

6 More Replies
tinodj
by New Contributor II
  • 301 Views
  • 4 replies
  • 0 kudos

Real-time output missing when using “Upload and Run File” from VS Code

I am running Python files on a Databricks cluster using the VS Code Databricks extension, specifically the “Upload and Run File” command. I cannot get real-time output in the Debug Console. I have checked the official docs: https://learn.microsoft.com/...

Latest Reply
tinodj
New Contributor II
  • 0 kudos

Yes, prints and log messages are viewable in the driver logs as they happen. If the same file is run in the Databricks web UI, they are shown in the output window as they happen as well. But when run through VS Code, unfortunately they are not visible in the debu...

3 More Replies
DArcher
by New Contributor
  • 190 Views
  • 1 reply
  • 0 kudos

Printing Notebook Dashboards

Is it possible to print the tables in a notebook dashboard to a PDF?  I have about 10 tables for stratifications in a dashboard that would be great to print all at once into a clean pdf report.

Latest Reply
Raman_Unifeye
Contributor III
  • 0 kudos

Hi @DArcher - are you using the legacy dashboard or the modern Lakeview (AI/BI) dashboard? In the legacy one via notebook, there is no direct way to export. You will perhaps have to write a custom Python script to render the output as HTML and then print to PDF...
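A hedged sketch of that custom-script approach: render each table into one HTML page with the standard library, then print the resulting page to PDF from a browser or an HTML-to-PDF tool. The table names and rows below are made-up illustrations:

```python
import html

def tables_to_html(tables: dict[str, list[list[str]]]) -> str:
    """Render {title: rows} tables into a single printable HTML page."""
    parts = ["<html><body>"]
    for title, rows in tables.items():
        # Escape user-supplied text so it renders literally.
        parts.append(f"<h2>{html.escape(title)}</h2><table border='1'>")
        for row in rows:
            cells = "".join(f"<td>{html.escape(c)}</td>" for c in row)
            parts.append(f"<tr>{cells}</tr>")
        parts.append("</table>")
    parts.append("</body></html>")
    return "\n".join(parts)

# Hypothetical stratification tables; in a notebook you would collect
# these rows from your DataFrames first.
report = tables_to_html({
    "Stratification A": [["bucket", "count"], ["0-10", "42"]],
})
```

Writing `report` to a file and opening it in a browser gives a single "print to PDF" step covering all ten tables at once.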

ashish_modi
by New Contributor II
  • 1818 Views
  • 6 replies
  • 2 kudos

cloud_infra_costs

I was looking at the system catalog and noticed that there is an empty table called cloud_infra_costs. Could you tell me what this is for and why it is empty?

Latest Reply
Coffee77
Contributor III
  • 2 kudos

You can also take a look at the built-in cost control dashboard explained in the video below, or in the official Databricks documentation at https://docs.databricks.com/aws/en/admin/usage/. Concerning the dashboard, the relevant point for me was that you can insp...

5 More Replies