Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.
Here's your Data + AI Summit 2024 Data Governance recap to help you navigate the explosion of AI, data, and tools as you build a flexible, scalable governance framework that spans your entire data and AI estate.
Keynote: Evolving Data Governan...
I am currently trying to come up with a way to deploy Databricks with Terraform in a multi-region, multi-tenant environment. I am not talking about simple cases like this (https://docs.databricks.com/data-governance/unity-catalo...
Is it possible to sign DBFS file URLs via the service principal for temporary public access? What I actually want to do is to have integration with Databricks in my app platform. For that purpose, I need to sign DBFS file URLs from my platform (ec2) ...
Signing a file is generally handled by the underlying cloud storage. However, there might be better ways of doing this depending on what you want to share. If you want to share structured data you can save it as a Delta table in the Me...
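If the Delta table route fits, the sharing side can be sketched with Delta Sharing SQL. This is a minimal sketch with hypothetical share, table, and recipient names, not the original poster's setup:

```sql
-- Publish a table through Delta Sharing instead of signing raw file URLs
-- (share, table, and recipient names below are hypothetical)
CREATE SHARE IF NOT EXISTS app_share COMMENT 'Data exposed to the app platform';
ALTER SHARE app_share ADD TABLE main.default.events;

-- Create a recipient for the external platform and grant it read access
CREATE RECIPIENT IF NOT EXISTS app_platform;
GRANT SELECT ON SHARE app_share TO RECIPIENT app_platform;
```

The recipient then reads the shared table with a Delta Sharing client using the activation credential, which avoids exposing DBFS paths publicly.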
Starting with DBR 12.1, you can run the UNDROP command to recover a dropped managed or external table. The table must be in Unity Catalog for this feature to work. See https://docs.databricks.com/sql/language-manual/sql-ref-syntax-ddl-undrop-table.html for more det...
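A quick sketch of the command, using a hypothetical table name:

```sql
-- Recover a recently dropped Unity Catalog table by name
UNDROP TABLE main.default.my_table;

-- If the name has since been reused, a dropped table can also be
-- recovered by its table ID (ID placeholder here is hypothetical)
UNDROP TABLE WITH ID '<table-id>';
```

Note that UNDROP only works within the retention window after the drop; see the linked reference for the exact limits.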
We have set up SCIM with Okta at the account level, set up Unity Catalog, and are in the process of migrating groups from workspace-local to account-level. I have an instance profile that was assigned to a workspace-local group. Using `databricks_gro...
Retried this using `databricks_group_role` after the `1.210` release of the `databricks/databricks` provider. This worked with an account-level group using the workspace provider and credentials.
We have a problem with users having permissions to create tables in the legacy hive_metastore. This only happens when using a personal cluster and when the table location is not set. The default catalog has been set to main in the workspace, and users ...
I have a system where the data is governed by (A)AD groups. How do I create personal user sandboxes where the user can grant permission to another user, but only if that user has access to the originating data? I can hear myself that this sounds a b...
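The sandbox part can be sketched with Unity Catalog grants; all catalog, schema, and user names below are hypothetical. Note that Unity Catalog cannot natively condition one grant on the grantee's access to other data, so the "only if that user has access to the originating data" rule would have to be enforced by process or automation around these statements:

```sql
-- Per-user sandbox schema inside a dedicated sandbox catalog
CREATE SCHEMA IF NOT EXISTS sandbox.alice;
GRANT USE SCHEMA, CREATE TABLE ON SCHEMA sandbox.alice TO `alice@example.com`;

-- Alice can later share one of her derived tables with a colleague
GRANT SELECT ON TABLE sandbox.alice.derived_data TO `bob@example.com`;
```

One design option is to grant sandbox access only to the same AD groups that gate the originating data, so membership in the group remains the single source of truth.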
Dear Team, kindly help me change the default EBS volume type from GP2 to GP3 in my Job Compute and All-Purpose Compute. As per the documentation, the EBS volume configuration is not available under Instance. Attached the snaps for reference. Regards, Sridhar
Hi @667572
We haven't heard from you since the last response from @Retired_mod, and I was checking back to see if her suggestions helped you.
Otherwise, if you have a solution, please share it with the community, as it can be helpful to others.
A...
From our conversations a couple of months ago with the Databricks team supporting our account, it seems Databricks has done everything needed on their end for the enablement, and it is up to Microsoft to prioritize enablement of the same. No ETA has been...
Sorry for cross-posting https://community.databricks.com/t5/get-started-discussions/where-to-find-hive-metastore-thrift-uri-hive-metastore-uri-to/td-p/37064 I see that Databricks supports connecting to Unity Catalog using the Hive metastore API (https://...
"This path doesn't exist or we couldn't access it with the credential provided. Metastores require a directory path." Hello Team, while trying to create a metastore following the Databricks community document, I encountered the above error. Please help me in resolvin...
To resolve the error, you can follow these steps:
Verify the directory path: Ensure that the directory path you provided for the Metastore exists and is accessible. Double-check the path and make sure it is correct.
Check the credentials: Ensure th...
This could be a good starting point: https://docs.databricks.com/data-governance/unity-catalog/index.html
Hello, I recently tried running R code in a notebook. The notebook is about a year old and was used on a different cluster that no longer exists. I received the message, 'Your administrator has only allowed sql and python commands on this cluster. Thi...
There's a feature in preview available that allows configuring single user access mode clusters that support R, where the assigned single user is a service principal. You can bind multiple users to the service principal to achieve multiple user acces...
I manually added an external location based on a Terraformed S3 bucket. Now I want to Terraform the external location as well, but it complains that I have already registered another external location on that bucket (my manual one). I thought I would qu...
@espenol it looks like you have active tables in other catalogs that are being picked up from this location. Usually your metadata is stored in the metastore; only external table data gets stored in the external location. Please check and drop the tables/catalog whi...
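Before dropping anything, it can help to inspect what is registered against the location. A small sketch with hypothetical location and table names:

```sql
-- List registered external locations and inspect the manually created one
SHOW EXTERNAL LOCATIONS;
DESCRIBE EXTERNAL LOCATION my_manual_location;

-- Check where a suspect table actually stores its data before dropping it
DESCRIBE TABLE EXTENDED some_catalog.some_schema.some_table;
DROP TABLE some_catalog.some_schema.some_table;
```

Once nothing references the manual external location, it can be dropped and the Terraform-managed one created (or the existing one imported into Terraform state).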
Yes! They can and should, and they should do it now. That data is being aggregated, anonymized, and sold, and the patient doesn't see a dime. So knowing that you have the right of removal, and actually asking to be removed, are two major problems.