- 571 Views
- 3 replies
- 0 kudos
databricks terraform provider, databricks_credential resource, service
I cannot make the databricks_credential resource create a service credential. It works fine with storage credentials. However, when I put `purpose = "SERVICE"` plus aws_iam_role and comment, in the apply phase it fails with `Error: cannot create cred...
I have the same error message now when trying to create a USE_SCHEMA grant for a service principal as in https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/grant#schema-grants . I create a new service principal and th...
- 30215 Views
- 13 replies
- 14 kudos
Resolved! Installing libraries on job clusters
Simple question: what is the way to go to install libraries on job clusters? There does not seem to be a "Libraries" tab in the UI, as opposed to regular clusters. Does it mean that the only option is to use init scripts?
Frankly, I'm also scratching my head about this library installation conundrum in job clusters; the missing Libraries tab is definitely throwing me off. Seems like init scripts are the only officially sanctioned route, which feels a bit clunky. Are t...
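For what it's worth, init scripts are not the only route: job tasks can declare libraries directly in the job definition, and they get installed on the job cluster when the run starts. Below is a minimal, hedged sketch of that via the Jobs API 2.1; the host, token, notebook path, runtime, node type and package names are all placeholders, not values from the thread.

```python
# Hedged sketch: declaring task-level libraries in a Jobs API 2.1 job spec.
# HOST, TOKEN, paths, runtime and packages below are placeholders.
import requests

HOST = "https://<workspace-host>"    # placeholder
TOKEN = "<personal-access-token>"    # placeholder

job_spec = {
    "name": "example-job-with-libraries",
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Workspace/Users/<me>/my_notebook"},
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",   # placeholder runtime
                "node_type_id": "<node-type>",
                "num_workers": 1,
            },
            # Libraries are attached per task; they are installed on the job
            # cluster at run start, with no Libraries tab or init script needed.
            "libraries": [
                {"pypi": {"package": "great-expectations==0.18.12"}},
                # Wheels from a UC volume are supported on recent runtimes:
                {"whl": "/Volumes/<catalog>/<schema>/<volume>/my_pkg-1.0-py3-none-any.whl"},
            ],
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json())   # {"job_id": ...}
```

In the Jobs UI the same setting appears on the task (as dependent libraries) rather than as a cluster-level Libraries tab, and `%pip install` inside the notebook is another common alternative for Python packages.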
- 712 Views
- 3 replies
- 0 kudos
Databricks Execution Context scheduling
Databricks allows a cluster to have a maximum of 150 execution contexts. If a number of execution contexts are attached and running operations on a Databricks cluster with a driver and n workers, then how is the execution scheduled? I am assuming that o...
OK, I get it. An execution context is not bound to a CPU; it is like a session. So basically the limit of 150 execution contexts means that 150 sessions/Spark programs can run simultaneously on the cluster (whether that is possible on the hardware is an...
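To make the "context = session" point concrete, here is a hedged sketch against the legacy Command Execution API (API version 1.2), where each `contexts/create` call opens one such session on an existing cluster; the host, token and cluster ID are placeholders.

```python
# Hedged sketch using the legacy Command Execution API (1.2): each
# /contexts/create call opens one execution context (session) on the cluster.
# The ~150-context limit caps how many such sessions can exist at once; it
# says nothing about CPUs. Host, token and cluster ID are placeholders.
import requests

HOST = "https://<workspace-host>"
TOKEN = "<personal-access-token>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Open a Python session (execution context) on an existing cluster.
ctx = requests.post(
    f"{HOST}/api/1.2/contexts/create",
    headers=HEADERS,
    json={"clusterId": "<cluster-id>", "language": "python"},
).json()

# Submit a command into that session; commands from many contexts are all
# scheduled by the same driver onto the same underlying Spark cluster.
cmd = requests.post(
    f"{HOST}/api/1.2/commands/execute",
    headers=HEADERS,
    json={
        "clusterId": "<cluster-id>",
        "contextId": ctx["id"],
        "language": "python",
        "command": "print(spark.range(10).count())",
    },
).json()
print(cmd)  # returns a command id you can poll for status/results
```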
- 0 kudos
- 1000 Views
- 2 replies
- 1 kudos
Resolved! AWS Account Level provider "databricks" Authentication
I am trying to deploy an additional workspace to my AWS account with Terraform, but I keep running into the authentication error message below. I have searched everywhere in my account console and used ChatGPT, which keeps pointing me to the account con...
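A common cause of this class of error is pointing account-level resources at a workspace host instead of the accounts console. As a hedged cross-check (not the thread's accepted answer), the sketch below shows the equivalent account-level authentication with the Databricks Python SDK; if this works, the same host, account ID and service-principal credentials are what the Terraform provider block needs. All values are placeholders.

```python
# Hedged sketch of account-level authentication with the Databricks Python SDK.
# The same ingredients the Terraform provider needs for account-level work:
# the accounts-console host (not a workspace URL), the account ID, and
# account-admin service-principal credentials. All values are placeholders.
from databricks.sdk import AccountClient

a = AccountClient(
    host="https://accounts.cloud.databricks.com",   # AWS accounts console
    account_id="<databricks-account-id>",
    client_id="<service-principal-client-id>",
    client_secret="<service-principal-oauth-secret>",
)

# If this listing succeeds, the credentials are valid at the account level,
# which is what workspace creation requires.
for ws in a.workspaces.list():
    print(ws.workspace_name, ws.workspace_id)
```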
- 5030 Views
- 5 replies
- 6 kudos
Resolved! Issue with Databricks Alerts formatting sent to Microsoft Teams
Hello all, as of Thursday the 20th, 2025, our alert notifications sent via Microsoft Teams started arriving with no formatting (raw HTML). We use a custom template with QUERY_RESULT_TABLE and a few HTML tags. Looks like Databricks alerts deprecated ...
Hello, thanks for sharing the explanation and the link to Microsoft's documentation. We had removed the HTML formatting a few months ago.
- 1667 Views
- 5 replies
- 5 kudos
Resolved! Unable to Create Cluster in ADW Deployment — CONTROL_PLANE_REQUEST_FAILURE
I'm running into an issue with my Databricks workspace in Azure in my own VNet. I've successfully created two private endpoints: databricks_ui_api and browser_authentication. However, when I try to create a cluster, I get the following error: CONTROL_PL...
Hello @eshwari, good day. I think szymon_dybczak and I provided enough information; please let me know if you found the solution. If a reply was useful, you can mark it as the accepted solution to help others. Have a great day!
- 1553 Views
- 3 replies
- 2 kudos
Resolved! GA for AIM
Does anyone know when Automatic Identity Management will be generally available for all users? Announcing Automatic Identity Management for Azure Databricks | Databricks Blog
Thanks for the clarification! I manage a fairly large Databricks environment, so enabling features that are still in Public Preview can be a bit challenging from a governance and stability standpoint. If possible, could you share any tentative timelin...
- 513 Views
- 1 reply
- 0 kudos
PATs sharing in a global data platform
Hello all, checking on how others implement sharing of Databricks personal access tokens for authentication when you have at least 25+ different technologies extracting data via SQL warehouses (imagine a global data platform that hosts data for usag...
Hey @noorbasha534, honestly, I really understand your pain around token management. I face the same situation myself and it can definitely become a headache, especially when you have multiple technologies in play, some of them open-source, and even ca...
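One pattern that avoids circulating shared PATs at all is to give each consuming technology its own service principal and have it mint short-lived OAuth tokens on demand via the standard client-credentials flow. The sketch below is a hedged illustration of that flow, not the poster's setup; the workspace host, client ID and secret are placeholders.

```python
# Hedged sketch: each tool gets its own service principal and mints a
# short-lived OAuth token via the client-credentials flow against the
# workspace token endpoint, instead of reusing a shared long-lived PAT.
# Host, client ID and secret are placeholders.
import requests

WORKSPACE = "https://<workspace-host>"
CLIENT_ID = "<service-principal-application-id>"
CLIENT_SECRET = "<oauth-secret>"

resp = requests.post(
    f"{WORKSPACE}/oidc/v1/token",
    auth=(CLIENT_ID, CLIENT_SECRET),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
resp.raise_for_status()
token = resp.json()["access_token"]   # short-lived bearer token

# The token is then used as the bearer credential for the SQL warehouse
# driver (JDBC/ODBC/SQL connector), so rotation is per tool and no
# long-lived secret is passed between teams.
print(token[:12] + "...")
```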
- 771 Views
- 2 replies
- 0 kudos
Webhooks for Microsoft Teams are not being received from Databricks jobs
We have configured a couple of webhooks in Teams channels and added the URLs to Databricks under Settings > Notifications, but our jobs do not post anything into the Teams channels. This used to work but is now not doing anything.
Hello marcusfox, I'm encountering the same issue you did. If you've already solved it, could you please share the steps here? Thank you in advance for the guidance.
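A quick way to isolate whether the problem is on the Databricks side or the Teams side is to post to the webhook URL directly from outside Databricks. The payload below is an assumption: Power Automate "Workflows" webhooks (the replacement for the retired Office 365 connectors) generally expect an Adaptive Card envelope, while legacy connectors accepted a plain `{"text": "..."}` body; the URL is a placeholder.

```python
# Hedged smoke test: post directly to the Teams webhook URL to check whether
# the endpoint itself still accepts messages, independent of Databricks.
# The Adaptive Card envelope is an assumed Workflows-style payload; the URL
# is a placeholder.
import requests

WEBHOOK_URL = "https://<your-teams-workflow-or-connector-url>"

payload = {
    "type": "message",
    "attachments": [
        {
            "contentType": "application/vnd.microsoft.card.adaptive",
            "content": {
                "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
                "type": "AdaptiveCard",
                "version": "1.4",
                "body": [{"type": "TextBlock",
                          "text": "Webhook smoke test from outside Databricks"}],
            },
        }
    ],
}

resp = requests.post(WEBHOOK_URL, json=payload)
print(resp.status_code, resp.text[:200])  # 200/202 suggests the endpoint is alive
```

If the direct post succeeds but Databricks notifications still never arrive, recheck the notification destination configuration in the workspace; if it fails, the webhook itself (for example a retired Office 365 connector) is the likely culprit.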
- 2239 Views
- 6 replies
- 1 kudos
Resolved! Compute cluster in Azure workspace is unable to access Unity Catalog volume on storage account
Hi, I'm setting up a workspace in Azure with VNet injection. I'm able to upload files to a Unity Catalog managed storage account volume through the web UI, and access them from notebooks using serverless compute, for example, `dbutils.fs.list("/Volume...
Could the classic cluster still be using the old ABFS driver instead of the managed identity?
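One way to narrow that question down is to run the same volume access from the classic cluster and look at the underlying error. A minimal check, with a placeholder volume path:

```python
# Quick diagnostic to run on the classic cluster (not serverless): list the
# volume path and surface the underlying storage error, which usually shows
# whether the failure is networking (VNet/firewall) or authentication.
# The volume path is a placeholder.
volume_path = "/Volumes/<catalog>/<schema>/<volume>/"

try:
    files = dbutils.fs.ls(volume_path)       # UC volumes are accessed by path,
    print([f.name for f in files])           # not via abfss:// URLs or mounts
except Exception as e:
    print(type(e).__name__, str(e)[:500])    # e.g. a 403 vs. a connection timeout
```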
- 861 Views
- 3 replies
- 3 kudos
Resolved! Missing configured "sql" scope in Databricks Apps User Token
I have User authorization for apps enabled in my workspace. I have added the sql scope to my app. However, when making SQL queries to my app, authorization errors ensue: Error during request to server: : Provided OAuth token does not have required scop...
@Advika thanks. It looks like this was only a temporary issue; I had already restarted the app, but today it is working. I will mark your answer as accepted. The problem may have been due to recreating the app (using bundles), which reset the user sc...
- 751 Views
- 1 reply
- 0 kudos
Databricks Publish to PowerBI feature - Security aspect
Can someone please explain what access Databricks requires to publish UC tables to the Power BI service. In the above snapshot I see it says "read all workspace" - are these PBI workspaces or all Databricks workspaces? If I enable this request, will the publish...
@bharatn, at the bottom of your picture it says "Show Details". Perhaps clicking on that will provide some of the granularity you're looking for. If it's DB making the request to Microsoft, it'll be DB being able to see the PBI workspaces. I think the bottom...
- 703 Views
- 2 replies
- 3 kudos
Resolved! Databricks Free Edition - compute does not start
Hello, I am using the Databricks Free Edition, but for the past couple of days, I have been facing a problem where I am unable to start the serverless compute. It takes forever to initiate. When I check the log, I see the message that the compute <......
Hello @JrVerbiest! Thanks for sharing this, @BS_THE_ANALYST. This is a known issue with the Free Edition, and the team is already aware and working on a fix. It should be resolved soon. Thanks for your patience in the meantime!
- 1182 Views
- 6 replies
- 5 kudos
Resolved! Databricks CLI is encoding the secrets in base64 automatically
Hello everyone, I am using my Mac terminal to save a Databricks primary key using a specific scope and secret. Now everything runs smoothly, except when I get to the step where I generate a secret. The problem is that my primary key is, for example, "test12...
Hi @spearitchmeta, yes, you're right. In order to read the value of a secret using the Databricks CLI, you must decode the base64-encoded value. You can use jq to extract the value and base64 --decode to decode it: databricks secrets get-secret <scope-n...
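The same decode step can be done in Python if jq isn't at hand; the sketch below assumes, as the reply above implies, that `databricks secrets get-secret` prints a JSON object whose `value` field is base64-encoded. Scope and key names are placeholders.

```python
# Hedged sketch: decode the base64 "value" returned by the Databricks CLI.
# Assumes the CLI prints JSON like {"key": "...", "value": "<base64>"}.
# Scope and key names are placeholders.
import base64
import json
import subprocess

out = subprocess.run(
    ["databricks", "secrets", "get-secret", "<scope-name>", "<secret-key>"],
    capture_output=True, text=True, check=True,
)
value_b64 = json.loads(out.stdout)["value"]
print(base64.b64decode(value_b64).decode("utf-8"))  # the original plaintext, e.g. "test12..."
```

Inside a notebook, `dbutils.secrets.get(scope, key)` already returns the decoded string (redacted when displayed), so the extra step is only needed when going through the CLI or REST API.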
- 916 Views
- 3 replies
- 0 kudos
How to run n processes in parallel in Databricks
Requirement: I have a volume into which random txt files with random numbers arrive from MQ. In my workspace I have a Python script. Also, I have created a job that triggers automatically when a new file arrives in the volume. My requirement is, I need...
Hello @jitenjha11: You can do it the same way @MujtabaNoori highlighted, but you have to call the process twice. Sharing the sample reference code below: iterating through the files in each directory. for directory in direct...
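Since the referenced sample is truncated above, here is a generic, hedged sketch of fanning the per-file work out with a worker pool from the job's Python script. The volume path and the `process_file` body are hypothetical placeholders for whatever the existing script does per file.

```python
# Hedged sketch: process the txt files in a UC volume directory in parallel.
# VOLUME_DIR and process_file() are hypothetical placeholders.
import os
from concurrent.futures import ThreadPoolExecutor, as_completed

VOLUME_DIR = "/Volumes/<catalog>/<schema>/<volume>/incoming"   # placeholder

def process_file(path: str) -> str:
    # Placeholder for the real per-file logic (parse the numbers, write results, ...).
    with open(path) as fh:
        numbers = [line.strip() for line in fh if line.strip()]
    return f"{os.path.basename(path)}: {len(numbers)} values"

txt_files = [
    os.path.join(VOLUME_DIR, name)
    for name in os.listdir(VOLUME_DIR)
    if name.endswith(".txt")
]

# max_workers controls how many files are handled at once; tune it to the
# driver size, or switch to ProcessPoolExecutor for CPU-bound work.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = {pool.submit(process_file, p): p for p in txt_files}
    for fut in as_completed(futures):
        print(fut.result())
```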