- 286 Views
- 2 replies
- 1 kudos
Task Hanging issue on DBR 15.4
Hello, I am running a structured streaming pipeline with 5 models loaded using pyfunc.spark_udf. Lately we have been noticing a very strange issue of tasks hanging, and the batch is taking a very long time to finish execution. CPU utilization is around...
On DBR 15.4 the DeadlockDetector: TASK_HANGING message usually just means Spark has noticed some very long-running tasks and is checking for deadlocks. With multiple pyfunc.spark_udf models in a streaming query the tasks often appear “stuck” because ...
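For context, a minimal sketch of the pattern described above, assuming a Databricks notebook where `spark` is already defined; the model URIs, source, sink, and checkpoint paths are placeholders, not the poster's actual pipeline:

```python
# Minimal sketch (placeholder URIs/paths): several MLflow pyfunc models applied
# in one structured streaming query, the setup described in this thread.
import mlflow.pyfunc
from pyspark.sql import functions as F

model_uris = [f"models:/demo_model_{i}/Production" for i in range(5)]  # placeholders
udfs = [mlflow.pyfunc.spark_udf(spark, uri, result_type="double") for uri in model_uris]

stream_df = spark.readStream.format("delta").load("/tmp/input_table")  # placeholder source

scored = stream_df
for i, udf in enumerate(udfs):
    # Each pyfunc UDF scores rows in a Python worker per task; stacking five of
    # them multiplies per-batch scoring time, which can look like a "stuck" task.
    scored = scored.withColumn(f"score_{i}", udf(F.struct(*stream_df.columns)))

query = (
    scored.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/chk")  # placeholder checkpoint path
    .start("/tmp/output_table")                # placeholder sink
)
```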
- 465 Views
- 4 replies
- 3 kudos
Resolved! Asset bundle vs terraform
I would like to understand the differences between Terraform and Asset Bundles, especially since in some cases, they can do the same thing. I’m not talking about provisioning storage, networking, or the Databricks workspace itself—I know that is Terr...
First, DAB uses Terraform in the background. Having said that, my recommendation is to use DAB for whatever components it already covers, and to use other tools only for IaC that is not yet supported or is not Databricks-specific (private VNets, external storage, etc.) ...
- 271 Views
- 3 replies
- 0 kudos
Databricks Federated Token Exchange Returns HTML Login Page Instead of Access Token (GCP → Databricks)
Hi everyone, I'm trying to implement federated authentication (token exchange) from Google Cloud → Databricks without using a client ID / client secret, only a Google-issued service account token. I have also created a federation policy in Databr...
You might want to check whether the issue is related to your federation policy configuration. Try reviewing the following documentation to confirm that your policy is correctly set up (issuer, audiences, and other expected claims): https://docs.databri...
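As a quick sanity check on the HTML-instead-of-token symptom, a hedged sketch of a workspace-level token-exchange call follows. It assumes the documented /oidc/v1/token endpoint and a federation policy matching the Google token's issuer and audience; the host and token values are placeholders:

```python
# Hedged sketch, not a confirmed reproduction: exchange a Google-issued JWT for a
# Databricks access token. Workspace URL and token below are placeholders.
import requests

WORKSPACE_HOST = "https://1234567890.1.gcp.databricks.com"   # placeholder workspace URL
google_id_token = "<google-issued service account JWT>"      # placeholder token

resp = requests.post(
    f"{WORKSPACE_HOST}/oidc/v1/token",
    data={
        "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
        "subject_token": google_id_token,
        "subject_token_type": "urn:ietf:params:oauth:token-type:jwt",
        "scope": "all-apis",
    },
)

content_type = resp.headers.get("Content-Type", "")
print(resp.status_code, content_type)
# An HTML body here usually means the request was redirected to a login page
# (wrong host or path); a policy mismatch normally comes back as a JSON error.
print(resp.json() if "json" in content_type else resp.text[:200])
```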
- 4242 Views
- 14 replies
- 2 kudos
Resolved! Which API to use to list groups in which a given user is a member
Is there an API that can be used to list groups in which a given user is a member? Specifically, I'd be interested in account (not workspace) groups. It seems there used to be a workspace-level list-parents API referred to in the answers to this quest...
There will be two system tables soon: users and groups. That should make life easier. I have already asked the Databricks RAS assigned to my customer to enable these for our control plane.
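Until those system tables are available, one hedged workaround with the Databricks Python SDK is to list account groups and filter memberships client-side. This is only a sketch: it assumes account-level host, account ID, and credentials are configured in the environment, and the user ID is a placeholder:

```python
# Hedged sketch: find account-level groups that contain a given user by filtering
# group members client-side. Credentials/account ID come from the environment;
# the user ID below is a placeholder.
from databricks.sdk import AccountClient

a = AccountClient()
target_user_id = "1234567890"  # placeholder numeric account user ID

memberships = [
    g.display_name
    for g in a.groups.list(attributes="id,displayName,members")
    if any(m.value == target_user_id for m in (g.members or []))
]
print(memberships)
```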
- 303 Views
- 3 replies
- 0 kudos
Azure VM quota for databricks jobs - demand prediction
Hey folks, a quick check - I wanted to gather thoughts on how you manage demand for Azure VM quota so you don't run into quota limit issues. In our case, we have several data domains (finance, master data, supply chain...) executing their projects in Dat...
Yes, Azure Databricks compute policies let you define “quota-like” limits, but only within Databricks, not Azure subscription quotas themselves. You still rely on Azure’s own quota system for vCPU/VM core limits at the subscription level. What you c...
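As an illustration of those "quota-like" limits, here is a hedged sketch of a cluster policy created via the Python SDK that caps worker count and restricts VM sizes per data domain; the policy name, node types, and limits are example values, and Azure subscription vCPU quota still applies on top:

```python
# Hedged sketch: a cluster policy acting as a Databricks-side cap for one domain.
# Names, node types, and limits are example values only.
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

policy_definition = {
    # Restrict which VM sizes the domain can request.
    "node_type_id": {"type": "allowlist", "values": ["Standard_D4ds_v5", "Standard_D8ds_v5"]},
    # Cap autoscaling so one job cannot consume the whole subscription quota.
    "autoscale.max_workers": {"type": "range", "maxValue": 8},
    # Force auto-termination so cores are released sooner.
    "autotermination_minutes": {"type": "fixed", "value": 30},
}

policy = w.cluster_policies.create(
    name="finance-domain-capped",  # example policy name
    definition=json.dumps(policy_definition),
)
print(policy.policy_id)
```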
- 177 Views
- 1 replies
- 0 kudos
Deploying Jobs in Databricks
How can I use the Databricks Python SDK from Azure DevOps to create or update a job and explicitly assign it to a cluster policy (by policy ID or name)? Could you show me an example where the job definition includes a task and a job cluster that refe...
To use the Databricks Python SDK from Azure DevOps to create or update a job and assign it explicitly to a cluster policy, specify the cluster policy by its ID in the job cluster section of your job definition. This ensures the cluster spawned for ...
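A hedged example of that job definition with the Python SDK follows; the policy ID, notebook path, node type, and Spark version are placeholders, and in Azure DevOps the host and token would normally be injected as pipeline variables:

```python
# Hedged example: create a job whose job cluster references a cluster policy by ID.
# IDs, paths, and names below are placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute, jobs

w = WorkspaceClient()  # host/token typically injected as Azure DevOps pipeline variables

created = w.jobs.create(
    name="example-policy-bound-job",
    job_clusters=[
        jobs.JobCluster(
            job_cluster_key="main",
            new_cluster=compute.ClusterSpec(
                policy_id="ABC1234DEF567890",       # placeholder cluster policy ID
                spark_version="15.4.x-scala2.12",
                node_type_id="Standard_D4ds_v5",
                num_workers=2,
                apply_policy_default_values=True,   # let the policy fill in defaults
            ),
        )
    ],
    tasks=[
        jobs.Task(
            task_key="etl",
            job_cluster_key="main",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/Shared/etl"),
        )
    ],
)
print(created.job_id)
```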
- 239 Views
- 1 replies
- 0 kudos
Unable to create connection in Power platform
When I try to create the connection, I get the error message "Connection test failed. Please review your configuration and try again." Here is the response in the network trace: My connection credentials are correct, so I'm not sure what I am doing wr...
The error message "Connection test failed. Please review your configuration and try again." when connecting Databricks to Power Platform can occur due to several common issues, even if your credentials are correct. Key troubleshooting steps: Double-c...
- 172 Views
- 1 replies
- 0 kudos
Need to create an Identity Federation between my Databricks workspace/account and my GCP account
I am trying to authenticate to my Databricks account using federation in order to fetch data. I have created a service account in GCP and generated a token using Google Auth, but I don't know how to exchange the token to authenticate Da...
Hi @GeraldBriyolan, you may need to use a Google ID token to do what you are trying to do: https://docs.databricks.com/gcp/en/dev-tools/auth/authentication-google-id
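For reference, a minimal hedged sketch of that approach with the Databricks Python SDK, which can mint and exchange the Google ID token itself when pointed at the service account; the workspace URL and service account address are placeholders:

```python
# Hedged sketch: authenticate with a Google service account (Google ID token auth).
# The workspace URL and service account email are placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://1234567890.1.gcp.databricks.com",                       # placeholder
    google_service_account="sa-name@my-project.iam.gserviceaccount.com",  # placeholder
)
print(w.current_user.me().user_name)  # confirms the exchanged token is accepted
```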
- 213 Views
- 3 replies
- 0 kudos
Cap on OIDC (max 20) Enable workload identity federation for GitHub Actions
Hi Databricks community, I have followed the page below and created GitHub OIDC federation policies, but there seems to be a cap on how many a service principal can have (20 max). Is there any workaround for this, or some other solution apart from using a client ID an...
I can't speak for specifically why, but allowing wildcards creates security risks and most identity providers and standards guidance require exact, pre-registered URLs.
- 342 Views
- 5 replies
- 3 kudos
Prevent Access to AI Functions Execution
As a workspace admin, I want to prevent unexpected API costs from unrestricted usage of AI Functions (AI_QUERY(), etc.). How can we control it so that only a particular group of users can execute AI Functions? I understand the function execution cost can be vi...
OK, so it has to be done at the individual endpoint and function level.
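One hedged sketch of the endpoint-level piece: grant CAN QUERY on a specific serving endpoint only to an approved group via the Python SDK. The endpoint name and group name are placeholders, not names from this thread:

```python
# Hedged sketch: limit who can query a serving endpoint (which ai_query calls route
# through) to one approved group. Endpoint and group names are placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.serving import (
    ServingEndpointAccessControlRequest,
    ServingEndpointPermissionLevel,
)

w = WorkspaceClient()

endpoint = w.serving_endpoints.get(name="my-ai-endpoint")  # placeholder endpoint name

# Replaces the endpoint's direct permissions with this access control list.
w.serving_endpoints.set_permissions(
    serving_endpoint_id=endpoint.id,
    access_control_list=[
        ServingEndpointAccessControlRequest(
            group_name="ai-functions-approved-users",  # placeholder group
            permission_level=ServingEndpointPermissionLevel.CAN_QUERY,
        )
    ],
)
```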
- 282 Views
- 1 replies
- 0 kudos
Azure DB Workspace Not Connected to DB Account Unity Catalog & Admin Console Missing (identity=null)
Hi team, I created a brand-new Azure environment and an Azure Databricks workspace, but the workspace appears to be in classic (legacy) mode and is not connected to a Databricks account, so Unity Catalog cannot be enabled. Below are all the details and...
I think you need a "corporate" account with the Azure Global Administrator role to enable/access the Databricks account. For instance, in some of my demo workspaces I can't access UC with my "hotmail" account. I haven't looked deeper into it so far. So, a...
- 571 Views
- 7 replies
- 1 kudos
Resolved! Deployment of private databricks workspace.
I tried to create a Databricks configuration with VNet injection and I faced a few problems during deployment. 1. I tried to deploy my workspace using IaC and Terraform. The whole time I faced an issue with the NSG, even when I created the configuration as follows in this ...
All issues were resolved. Ready-to-deploy code: locals { default_tags = { terraform = "true" workload = var.app env = var.environment } } resource "azurerm_databricks_access_connector...
- 301 Views
- 4 replies
- 0 kudos
Real-time output missing when using “Upload and Run File” from VS Code
I am running Python files on a Databricks cluster using the VS Code Databricks extension, specifically the "Upload and Run File" command. I cannot get real-time output in the Debug Console. I have checked the official docs: https://learn.microsoft.com/...
Yes, prints and logging output are viewable in the driver logs as they happen. If the same file is run in the Databricks web UI, they are viewable in the output window as they happen as well. But when run through VS Code, unfortunately they are not visible in the debu...
- 190 Views
- 1 replies
- 0 kudos
Printing Notebook Dashboards
Is it possible to print the tables in a notebook dashboard to a PDF? I have about 10 tables for stratifications in a dashboard that would be great to print all at once into a clean PDF report.
Hi @DArcher - are you using the legacy dashboard or the modern Lakeview (AI/BI) dashboard? In the legacy one via notebook, there is no direct way to export. You will perhaps have to use a custom Python script to write the output to HTML and then print to PDF...
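A hedged sketch of that custom-script route, assuming the tables behind the dashboard are registered in the metastore and that `spark` is available in the notebook; the table names and output path are placeholders:

```python
# Hedged sketch: dump each stratification table into a single HTML report, then
# print/convert that file to PDF. Table names and output path are placeholders.
tables = ["strat_by_age", "strat_by_region"]  # placeholder table names

html_parts = ["<html><body><h1>Stratification report</h1>"]
for t in tables:
    frame = spark.table(t).toPandas()               # assumes registered tables
    html_parts.append(f"<h2>{t}</h2>" + frame.to_html(index=False))
html_parts.append("</body></html>")

with open("/dbfs/tmp/strat_report.html", "w") as f:  # placeholder output path
    f.write("\n".join(html_parts))

# Open the HTML file in a browser and use "Print to PDF", or convert it with an
# HTML-to-PDF library if one is installed on the cluster.
```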
- 1818 Views
- 6 replies
- 2 kudos
cloud_infra_costs
I was looking at the system catalog and realized that there is an empty table called cloud_infra_costs. Could you tell me what this is for and why it is empty?
You can also take a look at the built-in cost control dashboard explained in the video below or in the official Databricks documentation at https://docs.databricks.com/aws/en/admin/usage/. Concerning the dashboard, the relevant point for me was that you can insp...
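If you prefer querying directly, a hedged sketch against the documented system.billing.usage table (which the cost dashboard is built on), run from a notebook where `spark` is available:

```python
# Hedged sketch: per-SKU DBU usage over the last 30 days from system.billing.usage.
usage = spark.sql(
    """
    SELECT usage_date,
           sku_name,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= current_date() - INTERVAL 30 DAYS
    GROUP BY usage_date, sku_name
    ORDER BY usage_date, sku_name
    """
)
usage.show(truncate=False)
```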