- 21068 Views
- 26 replies
- 19 kudos
How do you disable serverless interactive compute for all users?
I don't want users using serverless interactive compute for their jobs. How do I disable it for everyone, or for specific users?
Btw, I just realized that, at least with VNet-injected workspaces, you can probably prevent any sensible serverless usage by not granting permissions or a network route to the needed resources. At least in Azure Databricks, notebooks need access to Databr...
- 4596 Views
- 15 replies
- 14 kudos
Resolved! I need a switch to turn off Data Apps in Databricks workspaces
Hi, how do I disable Data Apps in my workspace? It is really annoying that Databricks pushes new features without any option to disable them. At least you should have some tools to control access before rolling it out. It seems you only care about fe...
I'm an account-level admin and can't find any option to manage creation access for Databricks Apps in the account console. Where can I limit app creation in the Account Console?
- 344 Views
- 2 replies
- 1 kudos
Resolved! Logs from dlt-execution computes
Hi guys! I faced an issue with the permission model in Databricks. Data engineers on my team are using a pipeline that runs on serverless compute. The permissions for the pipeline are configured correctly, for example as follows: resource "databricks_pe...
- 306 Views
- 1 replies
- 1 kudos
Resolved! Naming convention guidelines
Dear all, I'd appreciate it if anyone could point me to a document or source on naming conventions to follow within Databricks. Regards, Aijaz
Hi @miraijaz, Databricks doesn't enforce a single enterprise-wide naming standard, but there are a few official/public guidelines you can lean on. See the "Names" section of the SQL language reference. This covers allowed characters, length limits, a...
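If you want to enforce a convention programmatically (for example in a CI check), a small validator is easy to write. The sketch below assumes a simplified reading of the rules commonly cited for Unity Catalog object names (no dots, spaces, or forward slashes; at most 255 characters); verify the authoritative rules in the "Names" section of the SQL language reference before relying on them.

```python
# Illustrative only: the character and length rules below are assumptions
# based on a simplified reading of the SQL reference "Names" section,
# not an official Databricks validator.
MAX_NAME_LENGTH = 255
FORBIDDEN_CHARS = {".", " ", "/"}

def is_valid_uc_name(name: str) -> bool:
    """Return True if `name` passes the simplified Unity Catalog checks."""
    if not name or len(name) > MAX_NAME_LENGTH:
        return False
    return not any(ch in FORBIDDEN_CHARS for ch in name)

def check_table_reference(three_part: str) -> bool:
    """Validate a `catalog.schema.table` reference part by part."""
    parts = three_part.split(".")
    return len(parts) == 3 and all(is_valid_uc_name(p) for p in parts)
```

A check like this can run in a pre-deployment hook so badly named objects never reach the metastore.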
- 301 Views
- 3 replies
- 0 kudos
Resolved! Workspace deployed via AWS Marketplace
Workspace deployed via AWS Marketplace. The internal endpoint 10.53.215.1 exists in the VPC, but the SSL handshake fails and we cannot connect to the metastore. Workspace URL: dbc-bb08dd2f-f142.cloud.databricks.com; AWS Account: 452456948535; Region: us-east-1
Databricks endpoints present certificates for hostnames like *.cloud.databricks.com (or *.privatelink.cloud.databricks.com when PrivateLink is enabled). If your client connects to https://10.53.215.1 directly, the TLS ClientHello typically lacks the ...
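To see why connecting by raw IP fails verification, it helps to look at how wildcard certificates are matched. The sketch below is a simplified version of the single-label wildcard rule from RFC 6125 (real TLS stacks do more, and an IP address never matches a DNS wildcard); it shows that the workspace hostname matches `*.cloud.databricks.com` while `10.53.215.1` cannot.

```python
def wildcard_match(hostname: str, pattern: str) -> bool:
    """Simplified certificate hostname check: a '*' label matches exactly
    one DNS label (per RFC 6125); every other label must match exactly.
    Illustrative only -- use your TLS library's built-in verification
    in real code."""
    host_labels = hostname.lower().split(".")
    pat_labels = pattern.lower().split(".")
    if len(host_labels) != len(pat_labels):
        return False  # wildcard covers one label, never several
    for h, p in zip(host_labels, pat_labels):
        if p == "*":
            continue  # wildcard consumes this single label
        if h != p:
            return False
    return True
```

This is why the fix is to connect via the DNS name (workspace or PrivateLink hostname) rather than the underlying IP.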
- 281 Views
- 1 replies
- 0 kudos
Error creating Git folder: Invalid Git provider credential although PAT is valid
Hi everyone, I'm getting this error when trying to create a Git folder in Databricks: "Error creating Git folder: Invalid Git provider credential for repository with URL [Placeholder]." How to fix: Please go to your remote Git provider to ensure that: You have...
1. Confirm the HTTPS URL with a `.git` suffix and no embedded credentials in the URL:

| Don't Use | Use Instead |
|---|---|
| git@github.com:org/repo.git (SSH) | https://github.com/org/repo.git |
| https://github.com/org/repo (no .git) | https://github.com/org/repo.git |
| https://user@github.com/or... | |
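The URL rewrites above can be automated. This is an illustrative sketch (not an official Databricks utility) that converts scp-style SSH remotes to HTTPS, strips any embedded username, and appends a missing `.git` suffix:

```python
import re

def normalize_git_url(url: str) -> str:
    """Rewrite a Git remote URL into the HTTPS + `.git` form described
    above: convert scp-style SSH URLs, drop embedded user info, and
    append `.git` if missing. Illustrative sketch only."""
    # scp-style SSH: git@host:org/repo.git -> https://host/org/repo.git
    m = re.match(r"^git@([^:]+):(.+)$", url)
    if m:
        url = f"https://{m.group(1)}/{m.group(2)}"
    # Drop embedded user info: https://user@host/... -> https://host/...
    url = re.sub(r"^https://[^/@]+@", "https://", url)
    if not url.endswith(".git"):
        url += ".git"
    return url
```

Running the cleaned URL through the Git folder dialog avoids the most common "Invalid Git provider credential" false positives caused by URL format rather than by the PAT itself.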
- 921 Views
- 9 replies
- 0 kudos
Resolved! Unable to connect to any cluster from a notebook
I'm experiencing an unusual issue following my return from annual leave. I'm unable to connect to any compute from a notebook (both Classic Compute and Serverless), despite having Can Manage permissions on the clusters. The error shown is: "Unk...
Hi all, the issue should now be mitigated. Really appreciate your patience on this! Do let us know if you're still experiencing any problems.
- 641 Views
- 4 replies
- 2 kudos
Resolved! What is the best way to use Unity Catalog with a medallion architecture on ADLS Gen2?
Hi, I am using a medallion architecture on Azure Data Lake Storage Gen2 with Azure Databricks. Currently, I am storing data in Parquet format (not Delta tables), and I am planning to implement Unity Catalog (UC). As part of this setup, I understand tha...
I was going to follow the 3rd option, but it violates our medallion architecture, and we don't have that much data to separate it physically. So I'm going with the 1st approach. Thank you very much @karthickrs, I'll keep this in mind.
- 262 Views
- 1 replies
- 0 kudos
Resolved! [Unity Catalog] Missing credential type when connecting GCS to Databricks in an AWS environment
Hi, I'm using Databricks in an AWS environment and I'm trying to link data from GCP GCS to Unity Catalog. [Official document] I tried to set it up by referring to the official Databricks guide below. ▶ Create service credentials guides [Problem situat...
Hi — this is expected behavior, not a bug. Unity Catalog storage credentials in the UI are cloud-specific to your workspace deployment. Since your workspace runs on AWS, you only see AWS IAM Role and Cloudflare API Token. The GCP Service Account opti...
- 499 Views
- 2 replies
- 0 kudos
Unable to enable SAML SSO
We are planning to implement this architecture: PingFederate is the front IdP -> Microsoft Entra is the actual IdP where we have our users -> SAML SSO -> Databricks (SP). I have done all the config setups and everything is working as expected; the request is finally ...
This may not be solvable here. Do you have support via Databricks? You'll need to create a support ticket (directly with Databricks if on AWS, or an MSFT ticket if on Azure). If not, ping your account team for help. Just saying, because it might require the type of debugging that'...
- 533 Views
- 3 replies
- 0 kudos
Resolved! Get resource permissions using terraform
Is there a way to retrieve permissions for resources (cluster, job, volume, catalog, and all other objects)? In the Terraform docs there's a resource databricks_permissions, but I didn't find a data source databricks_permissions, grants, or similar. How can I get...
Hi @fkseki, there isn't a data source "databricks_permissions" (or similar) in the Databricks Terraform provider, only the databricks_permissions resource, and that resource is authoritative for the full ACL of the object. That means Terraform can't read the current...
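Since Terraform can't read ACLs it doesn't manage, one workaround is to query the workspace-level Permissions REST API directly and feed the result into your tooling. A minimal sketch, assuming the documented endpoint layout `GET /api/2.0/permissions/{request_object_type}/{request_object_id}` (e.g. `clusters`, `jobs`) and PAT authentication; error handling and pagination are omitted:

```python
import json
import urllib.request

def permissions_path(object_type: str, object_id: str) -> str:
    """Build the Permissions API path for one object, e.g.
    permissions_path('clusters', '1234-abcd'). The path shape is taken
    from the REST API docs; confirm it for your object type."""
    return f"/api/2.0/permissions/{object_type}/{object_id}"

def get_permissions(host: str, token: str, object_type: str, object_id: str) -> dict:
    """Fetch the current ACL for one object; requires a valid PAT with
    permission to read the object's ACL."""
    req = urllib.request.Request(
        host.rstrip("/") + permissions_path(object_type, object_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

You could then diff the returned `access_control_list` against what your databricks_permissions resources declare before importing or overwriting anything.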
- 573 Views
- 2 replies
- 1 kudos
Resolved! Manage MFA
I can't seem to manage my MFA. It just sends me into a loop. Can someone help me look into this please?
Thanks for your help; doing this in an incognito window worked.
- 485 Views
- 2 replies
- 0 kudos
AI Runtime Public Preview?
How can I get access to the AI Runtime? I see that it is in Public Preview, but I don't see a feature toggle for it on the account Previews page. https://docs.databricks.com/aws/en/machine-learning/ai-runtime/
The confusion comes from AI Runtime having two components with different preview states. Single-node tasks are already open and do not require any toggle. You can just open a notebook, click the Connect dropdown, select Serverless GPU, choose your acc...
- 316 Views
- 2 replies
- 1 kudos
Resolved! Which role is recommended to create and manage Unity Catalog objects—Workspace Admin or Metastore Admin?
Which role is recommended to create and manage Unity Catalog objects (catalog, schema, Storage credentials, External Location)—Workspace Admin or Metastore Admin—and why?
I am designing the security model for our Databricks platform and need guidance on role selection for managing Unity Catalog. Which role should be used for creating and managing Unity Catalog objects such as Storage Credentials, External Locations, C...
- 503 Views
- 5 replies
- 3 kudos
Spark config ignored in job run settings
I am talking about this setting. So far I have tried:
spark.executor.cores 8
and
spark.log.level INFO
Both are documented here: https://spark.apache.org/docs/latest/configuration.html, but I neither see any effect nor see them set when I check Spark UI -> Environment ta...
Hi, you're not putting them in the wrong place; it's just that Databricks doesn't allow certain configs, because they are managed by Databricks for you. For example, the executor cores config you've shown above won't be recognised, as this is set by the selected c...