Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Forum Posts

MadelynM
by Databricks Employee
  • 3891 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Data Governance | Navigate the explosion of AI, data and tools

Here's your Data + AI Summit 2024 - Data Governance recap as you navigate the explosion of AI, data and tools in efforts to build a flexible and scalable governance framework that spans your entire data and AI estate. Keynote: Evolving Data Governan...

emanueol
by New Contributor III
  • 285 Views
  • 5 replies
  • 0 kudos

Foreign catalog to Snowflake

While learning about Databricks foreign catalogs (I'm on a free-tier DBX account), it seems there are two ways of creating a foreign catalog to Snowflake: via CONNECTION type=snowflake, which seems to be a JDBC connection through which DBX pulls all metadata of one Snowflake datab...

Latest Reply
SteveOstrowski
Databricks Employee
  • 0 kudos

Hi @emanueol, Your follow-up question is clear, and it is a good distinction to make. Let me address it directly. SHORT ANSWER Yes, when you set up Snowflake Catalog Federation in Databricks, Databricks does use the Snowflake Horizon REST Iceberg cat...

4 More Replies
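For readers following this thread, the CONNECTION-based route boils down to two SQL statements. Below is a minimal sketch that renders them from Python; the connection/catalog names, host, and secret scope are hypothetical, and the exact OPTIONS keys are from memory, so verify them against the CREATE CONNECTION documentation.

```python
# Sketch: render the two DDL statements for the CONNECTION-based route
# discussed above. All names and the OPTIONS keys are assumptions.

def snowflake_federation_ddl(conn: str, catalog: str, host: str,
                             user: str, scope: str, database: str) -> list[str]:
    """Return the CREATE CONNECTION and CREATE FOREIGN CATALOG statements."""
    create_connection = (
        f"CREATE CONNECTION {conn} TYPE snowflake OPTIONS ("
        f"host '{host}', port '443', user '{user}', "
        f"password secret('{scope}', 'sf_password'))"
    )
    create_catalog = (
        f"CREATE FOREIGN CATALOG {catalog} USING CONNECTION {conn} "
        f"OPTIONS (database '{database}')"
    )
    return [create_connection, create_catalog]

ddl = snowflake_federation_ddl("sf_conn", "sf_cat", "acme.snowflakecomputing.com",
                               "svc_dbx", "sf_secrets", "ANALYTICS")
```

In a notebook you would then run each statement with spark.sql(...).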
vjussiiih
by New Contributor
  • 243 Views
  • 1 reply
  • 0 kudos

Grant permissions to existing catalogs/schemas using Databricks Asset Bundles

Hi, I’m trying to use Databricks Asset Bundles (DAB) to assign Unity Catalog grants to catalogs and schemas that already exist in my workspace. These catalogs and schemas were not originally created through DAB, but I would now like to manage their gra...

Latest Reply
SteveOstrowski
Databricks Employee
  • 0 kudos

Hi @vjussiiih, Let me walk you through this. You are correct that DABs treat schemas and catalogs defined under "resources" as fully managed resources, which means they attempt to create them on first deploy and manage their full lifecycle. This is w...

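As the reply notes, resources declared under a bundle's "resources" are fully managed; for grants on pre-existing objects, one common workaround is to call the Unity Catalog permissions endpoint directly. A sketch of the request shape, with hypothetical names (send it through your own API client):

```python
# Sketch: build the Unity Catalog permissions PATCH request for granting on an
# existing schema outside of DAB-managed resources. Names are hypothetical.

def build_grant_request(securable_type: str, full_name: str,
                        principal: str, privileges: list[str]) -> tuple[str, dict]:
    """Return (path, body) for PATCH /api/2.1/unity-catalog/permissions/..."""
    path = f"/api/2.1/unity-catalog/permissions/{securable_type}/{full_name}"
    body = {"changes": [{"principal": principal, "add": privileges}]}
    return path, body

path, body = build_grant_request("schema", "main.analytics",
                                 "data-readers", ["USE_SCHEMA", "SELECT"])
```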
r_w_
by New Contributor III
  • 96 Views
  • 1 reply
  • 1 kudos

Unity Catalog Data Classification Dashboard: When Does the “User Access” Column Get Populated?

Hi everyone, I’m currently testing the Data Classification feature in Databricks Unity Catalog. On the dashboard, after classification completes, there’s a column called “User Access.” Based on the description, it seems to show the number of users w...

Latest Reply
SteveOstrowski
Databricks Employee
  • 1 kudos

Hi @r_w_, appreciate you sharing the details. The “User Access” column in the data classification results view is one of those features that is not yet thoroughly covered in the public documentation, so I understand the confusion. Here is what I have...

sukhendu2017
by New Contributor
  • 204 Views
  • 3 replies
  • 1 kudos

Databricks Unity Catalog REST API documentation link and lineage retention period

Hi Team, hope you are all doing well. Question 1: I am using the Databricks Unity Catalog REST API endpoint api/2.0/lineage-tracking/table-lineage?table_name=schemaname.catalogname.tablename&include_entity_lineage=true to get lineage. Is this of...

Data Governance
unitycatalog
Latest Reply
sukhendu2017
New Contributor
  • 1 kudos

Thank you so much for the detailed response and solution. I have one question: the Unity Catalog lineage UI is showing the last 18 months, but per the official documentation, lineage is retained for up to 1 year. Why does this contradict the documentation? Also, I am trying...

2 More Replies
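For anyone reproducing the lineage call from this thread, here is a small sketch that builds the request URL (note that Unity Catalog full table names are normally catalog.schema.table); the host and table name are placeholders:

```python
from urllib.parse import urlencode

def table_lineage_url(host: str, full_table_name: str,
                      include_entity_lineage: bool = True) -> str:
    """Build the lineage-tracking request URL for a three-level table name."""
    query = urlencode({
        "table_name": full_table_name,
        "include_entity_lineage": str(include_entity_lineage).lower(),
    })
    return f"{host}/api/2.0/lineage-tracking/table-lineage?{query}"

url = table_lineage_url("https://example.cloud.databricks.com",
                        "main.sales.orders")
```

You would then GET this URL with a bearer token, for example via the requests library.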
bigdatabase
by New Contributor
  • 79 Views
  • 1 reply
  • 0 kudos

Turn off data export functionality

Is there currently a way to limit data exporting, i.e. not allowing users to export data from DB objects to CSV or Excel, or to copy results to the clipboard? I see that exporting/downloading data can currently be controlled at the workspace level, but I am proposin...

Latest Reply
Ashwin_DSA
Databricks Employee
  • 0 kudos

Hi @bigdatabase, As of today, Databricks lets you control downloading notebook results only at the workspace level. As you pointed out, a workspace admin can disable downloading results from the notebook for all users in that workspace. However, ther...

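The workspace-level switch mentioned in the reply can also be flipped via the workspace-conf API. The key names below are from memory and should be double-checked against the workspace settings documentation before use:

```python
# Sketch: request shape for disabling result downloads via the workspace-conf
# API. The configuration key names are assumptions -- verify before relying
# on them.

def disable_downloads_request() -> tuple[str, dict]:
    """Return (path, body) for a PATCH against the workspace-conf endpoint."""
    body = {
        "enableResultsDownloading": "false",      # notebook results download
        "enableNotebookTableClipboard": "false",  # copying results to clipboard
    }
    return "/api/2.0/workspace-conf", body

path, body = disable_downloads_request()
```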
TanushM
by New Contributor
  • 11016 Views
  • 6 replies
  • 0 kudos

Unity Catalog- Informatica Cloud Data Governance & Catalog Integration

We are attempting to create a comprehensive Data Governance solution using Unity Catalog & INFA CDGC tool. The objective is to onboard the business assets on the Informatica DG platform and use the unity catalog to trace technical assets and lineage ...

Latest Reply
abhartiya
New Contributor
  • 0 kudos

@TanushM -- We are also planning to do the same. However, we are struggling with the volume of tables inside some schemas in a catalog. Can you suggest some best practices you have come across while integrating Unity Catalog with CDGC?

5 More Replies
Erik
by Valued Contributor III
  • 2939 Views
  • 5 replies
  • 4 kudos

Resolved! Will Unity catalog come to Europe?

It is our understanding that enabling unity catalog means that some metadata (for example usernames) will be sent to the USA for processing/storage. This is unfortunately a deal-breaker for us, and we need the data to reside solely in Europe for comp...

Latest Reply
youssefmrini
Databricks Employee
  • 4 kudos

Usernames have always been stored centrally in the US; that's similar to most global services for the IAM layer. The only additional thing that gets stored in the US as part of UC is the metastore names. You can check this link ( https://azure.micros...

4 More Replies
ctech932
by New Contributor
  • 127 Views
  • 2 replies
  • 1 kudos

Resolved! Databricks autoloader with manual file delete?

While we evaluate moving our many Auto Loader configurations to use `cloudFiles.cleanSource`, we're wondering if we can instead just implement a lifecycle policy outside of Databricks that deletes files older than 30 days. Is there a problem with doin...

Latest Reply
saurabh18cs
Honored Contributor III
  • 1 kudos

Hi, why are you not planning to move away from directory listing mode to useManagedFileEvents? Execution will be faster, with no more scanning of directories every time. File events use a single Azure Databricks-managed file notification queue for all...

1 More Reply
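To make the two modes in this exchange concrete, here is a sketch of the Auto Loader option map; the paths are placeholders, and the file-events option name should be confirmed against the Auto Loader documentation:

```python
# Sketch: Auto Loader options for directory listing vs. managed file events.
# Format, schema location, and the option spelling are assumptions.

def autoloader_options(use_file_events: bool) -> dict:
    opts = {
        "cloudFiles.format": "json",
        "cloudFiles.schemaLocation": "/Volumes/main/default/_schemas/orders",
    }
    if use_file_events:
        # Managed file events: one Databricks-managed notification queue,
        # instead of re-listing the source directory on every micro-batch.
        opts["cloudFiles.useManagedFileEvents"] = "true"
    return opts

opts = autoloader_options(use_file_events=True)
```

In a notebook this would feed spark.readStream.format("cloudFiles").options(**opts).load(source_path).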
APJESK
by Contributor
  • 138 Views
  • 1 reply
  • 0 kudos

Resolved! Unity catalog management

How is Unity Catalog managed in real time at an enterprise scale, including workspace-level restrictions, privilege-based ACLs, row- and column-level security, ABAC, and tag-driven governance, and which languages or tools are used to manage the entir...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @APJESK, The most common approach I've seen in enterprises is to use Terraform to govern Unity Catalog. Below you can find a good series of articles that introduces this concept: https://pl.seequality.net/terra-dbx-p1/ Databricks terraform provider is...

ADBricksExplore
by New Contributor II
  • 243 Views
  • 2 replies
  • 3 kudos

Child subqueries/sub-statement history metrics from a parent [CALL...] statement in Query History

Hi, so far I cannot find a way to get, programmatically (SQL/Python), the subquery/sub-statement execution history records shown in the ADBricks UI Query History/Profile that were executed during a task run of a job, as shown in the [red boxes] in the atta...

Latest Reply
Louis_Frolio
Databricks Employee
  • 3 kudos

Greetings @ADBricksExplore ,  Short answer: there isn’t a supported public API that returns the “Substatements / Subqueries” panel you see in the Query History or Profile UI. The GraphQL endpoints the UI relies on are internal and not stable or suppo...

1 More Reply
andreos
by New Contributor
  • 2557 Views
  • 2 replies
  • 1 kudos

Manage serverless budget policy permission via API

Hi everyone, I'm using the Budget Policy API (https://docs.databricks.com/api/account/budgetpolicy/create) to create serverless budget policies. I can successfully create and retrieve policies, but I haven’t found any way to manage their permissions —...

Data Governance
Budget policies
Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Here are some helpful hints/tips/tricks: Programmatic Management of Budget Policy Permissions: Options and Best Practices 1. What is Possible Today? Yes, there is a programmatic way to manage permissions (user and group assignments) for Databricks Bu...

1 More Reply
discuss_darende
by New Contributor II
  • 305 Views
  • 2 replies
  • 1 kudos

Resolved! How can I get workspace groups and their users via a table — and also from a Databricks App?

I’m trying to get a full list of Databricks workspace groups and their user memberships. I want to do this in two ways: as a queryable table or view (e.g., for audits, security reviews, app integration), and from within a Databricks App (Streamlit-style), u...

Latest Reply
Raman_Unifeye
Honored Contributor III
  • 1 kudos

@discuss_darende - you could use the code below in a notebook. Please adjust it based on your needs.

from databricks.sdk import AccountClient, WorkspaceClient

# If env vars are set, this picks them up automatically
a = WorkspaceClient()

# List identities u...

1 More Reply
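Building on the snippet in the reply above, a pure helper like the following can flatten groups into rows suitable for a table or an app dataframe. The field names assume a SCIM-style shape; the SDK call that feeds it is shown in the note below.

```python
# Sketch: flatten group -> member pairs into rows you can write to a table.
# This helper takes plain dicts so it can be tested without a workspace.

def flatten_memberships(groups: list[dict]) -> list[dict]:
    """groups: [{'displayName': ..., 'members': [{'display': ...}, ...]}, ...]"""
    rows = []
    for g in groups:
        for m in g.get("members") or []:
            rows.append({"group": g["displayName"], "member": m["display"]})
    return rows

rows = flatten_memberships([
    {"displayName": "admins",
     "members": [{"display": "alice"}, {"display": "bob"}]},
])
```

With the SDK this would be fed by something like: w = WorkspaceClient(); data = [{"displayName": g.display_name, "members": [{"display": m.display} for m in (g.members or [])]} for g in w.groups.list()], and the rows then go into spark.createDataFrame(...) or a Streamlit table.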
ikhwan
by New Contributor II
  • 291 Views
  • 2 replies
  • 1 kudos

INVALID_PARAMETER_VALUE: DBR version 16.4 or higher is required to query table

We have been encountering an issue since around 17 January 2026. We built a pipeline using DLT and RLS to filter rows based on the user's email. Previously our Row Level Security (RLS) was working fine, until 3 days ago, when the dashboard ran into a problem saying: ...

Data Governance
abac
Databricks Runtime
DBR
Row Level Security
Latest Reply
ikhwan
New Contributor II
  • 1 kudos

I tried creating a serverless workspace in us-east-1, and the RLS works fine there. I think it's confirmed that this only happens in specific regions (in my case, ap-southeast-1).

1 More Reply
cobba16
by New Contributor II
  • 301 Views
  • 1 reply
  • 1 kudos

Hive metastore CANNOT access storage

I'm new to Azure Databricks and I'm facing an issue when trying to create a schema or table that points to my Azure Storage account. I keep getting this error: ```[EXTERNAL_METASTORE_CLIENT_ERROR.OPERATION_FAILED] Client operation failed: org.apache.h...

Latest Reply
Commitchell
Databricks Employee
  • 1 kudos

Hi Cobba16, A couple thoughts for you. It looks like you're setting the credentials in a spark config from within a Notebook. That's why operations execute in your session context, but when trying to create the Schema, the Hive Metastore is trying to...

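The fix the reply points toward (cluster-level Spark config rather than a notebook-session config) looks roughly like this; the secret scope/key names and the tenant placeholder are hypothetical, and the Hadoop ABFS key names should be verified against the Azure storage access documentation:

```python
# Sketch: ABFS OAuth configuration keys to set on the *cluster* Spark conf so
# that the Hive Metastore client, not just the notebook session, can
# authenticate. Secret scope/key names and the tenant ID are placeholders.

def abfs_oauth_conf(storage_account: str) -> dict:
    suffix = f"{storage_account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}":
            "{{secrets/my_scope/sp_client_id}}",
        f"fs.azure.account.oauth2.client.secret.{suffix}":
            "{{secrets/my_scope/sp_client_secret}}",
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }

conf = abfs_oauth_conf("mystorage")
```

These key/value pairs go into the cluster's Spark config (with the secret references resolved by Databricks), rather than being set via spark.conf.set in a notebook.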
Harikrish
by New Contributor III
  • 34848 Views
  • 3 replies
  • 2 kudos

Resolved! Data Governance

If I grant all privileges on my schema, does that automatically give users access to all underlying objects? Or should I grant access separately for each object?

Latest Reply
SHampton
New Contributor II
  • 2 kudos

Are there Houston local practitioners interested in building up the data gov (strategy, implementation, value add) group? #unitycatalog #governedtags #systemtags #abac #principals #groups

2 More Replies
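For this thread's question: under Unity Catalog, privileges granted on a parent securable are inherited by the objects beneath it (legacy Hive metastore grants behave differently), so a schema-level grant typically covers the underlying objects as long as the principal also has USE CATALOG on the parent. A sketch with hypothetical names:

```python
# Sketch: the pair of grants usually needed to give a principal full access to
# everything in one schema under Unity Catalog. Names are placeholders.

def schema_grant_statements(catalog: str, schema: str, principal: str) -> list[str]:
    """Return GRANT statements relying on Unity Catalog privilege inheritance."""
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`",
        f"GRANT ALL PRIVILEGES ON SCHEMA {catalog}.{schema} TO `{principal}`",
    ]

stmts = schema_grant_statements("main", "sales", "analysts")
```

Each statement would be executed with spark.sql(...) or in a SQL editor by a user with the right to grant.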