Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Forum Posts

MadelynM
by Databricks Employee
  • 4040 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Data Governance | Navigate the explosion of AI, data and tools

Here's your Data + AI Summit 2024 - Data Governance recap as you navigate the explosion of AI, data and tools in efforts to build a flexible and scalable governance framework that spans your entire data and AI estate. Keynote: Evolving Data Governan...

Screenshot 2024-07-03 at 9.27.29 AM.png
rjsilva1987
by New Contributor
  • 1596 Views
  • 0 replies
  • 0 kudos

Lineage for on-premises DB2/LUW or z/OS

We have on-premises DB2/LUW and DB2 for z/OS databases, and we need to show lineage for them. Can Unity Catalog or Purview do this? If one is the best fit, which would it be? Or would working with both be a better option?

jv_v
by Databricks Partner
  • 4646 Views
  • 3 replies
  • 0 kudos

Issue with Creating External Location Using Service Principal in Terraform

I'm facing an issue while trying to create an external location in Databricks using Terraform and a service principal. The specific error message I'm encountering is: Error: Here's some context: Using Azure CLI (az login): the creation of the external l...

jv_v_0-1719497498809.png
Latest Reply
jacovangelder
Databricks MVP
  • 0 kudos

After creating the databricks_metastore resource, can you run databricks_grants? like this  resource "databricks_grants" "foo" { depends_on = databricks_metastore.foo metastore = databricks_metastore.foo.id grant { principal = <your service ...
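Filling in the truncated snippet, the suggested grant might look like the following complete block. This is a sketch only: the resource names, the metastore reference, and var.sp_application_id are placeholders, and the privilege list is illustrative.

```hcl
# Hypothetical sketch: grant metastore-level privileges to the service
# principal after the metastore exists. All names are placeholders.
resource "databricks_grants" "foo" {
  depends_on = [databricks_metastore.foo]
  metastore  = databricks_metastore.foo.id

  grant {
    principal  = var.sp_application_id # service principal application ID
    privileges = ["CREATE_EXTERNAL_LOCATION", "CREATE_STORAGE_CREDENTIAL"]
  }
}
```

Note that `depends_on` takes a list (`[...]`) in current Terraform syntax.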

2 More Replies
jv_v
by Databricks Partner
  • 3764 Views
  • 1 reply
  • 0 kudos

Failing Cluster Creation

I'm encountering an issue with my Terraform code for creating a cluster. The terraform plan command runs successfully and shows the correct changes, but the apply after that fails with errors. Here are the details: Terraform code: terraform { required_provid...

jv_v_0-1719500677324.png jv_v_1-1719500696573.png
Latest Reply
jacovangelder
Databricks MVP
  • 0 kudos

Are you getting two different errors? A default auth error usually means you need to explicitly set the providers in either the data or resource objects as well, or you're missing a depends_on attribute. I think for both cases it is the latter, i.e....
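As a sketch of that advice (the provider alias `databricks.workspace` and the workspace resource name are assumptions, not from the original code), pinning the aliased provider on each data source and resource plus an explicit depends_on looks like:

```hcl
# Sketch only: explicit provider + depends_on so Terraform doesn't fall
# back to default auth before the workspace exists.
data "databricks_spark_version" "latest_lts" {
  provider          = databricks.workspace
  long_term_support = true
  depends_on        = [azurerm_databricks_workspace.this]
}

resource "databricks_cluster" "this" {
  provider                = databricks.workspace
  depends_on              = [azurerm_databricks_workspace.this]
  cluster_name            = "example-cluster"
  spark_version           = data.databricks_spark_version.latest_lts.id
  node_type_id            = "Standard_DS3_v2" # placeholder node type
  num_workers             = 1
  autotermination_minutes = 20
}
```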

jv_v
by Databricks Partner
  • 4877 Views
  • 2 replies
  • 0 kudos

Resolved! Understanding the Use of a Specific Terraform Block in Unity Catalog Automation

I am currently working on automating Unity Catalog (UC) using Terraform, and I came across the following Terraform block:  resource "databricks_metastore_data_access" "first" {  provider = databricks.Workspace  metastore_id = databricks_metastore.thi...

Latest Reply
jv_v
Databricks Partner
  • 0 kudos

I implemented the following Terraform code for configuring a Databricks metastore data access:terraform {required_providers {azurerm = {source = "hashicorp/azurerm"}databricks = {source = "databricks/databricks"}}}provider "azurerm"{alias = "azure"sk...
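Expanding the truncated block, a complete databricks_metastore_data_access configuration along those lines might read as follows; the access connector reference and the `databricks.Workspace` alias are assumptions carried over from the snippets above.

```hcl
# Hypothetical sketch: register an Azure managed identity (via an access
# connector) as the default storage credential for the metastore.
resource "databricks_metastore_data_access" "first" {
  provider     = databricks.Workspace
  metastore_id = databricks_metastore.this.id
  name         = "metastore-managed-identity" # placeholder name
  is_default   = true

  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.this.id
  }
}
```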

1 More Replies
ArjunGopinath96
by New Contributor
  • 6936 Views
  • 1 reply
  • 1 kudos

Change Data Feed Cost

Greetings, I want to understand the efficiency of using Change Data Feed to track the changes of a table that has around 1 million rows. There will be around 20K appends per week. I read that to track appends, CDF is not the right way - if that's tru...

Latest Reply
Ravivarma
Databricks Employee
  • 1 kudos

Hello @ArjunGopinath96 , Greetings! Change Data Feed (CDF) in Delta Lake provides an efficient way to track changes in a table, including appends. It works by recording row-level changes between versions of a Delta table, capturing both the row data ...

jv_v
by Databricks Partner
  • 5022 Views
  • 2 replies
  • 1 kudos

Resolved! Issue Creating Metastore Using Terraform with Service Principal Authentication

I'm encountering an issue when attempting to create a metastore using Terraform with service principal authentication. Below is the error message I receive: Error: "module.metastore_and_users.databricks_metastore.this: error: cannot create metastore: d...

Latest Reply
jacovangelder
Databricks MVP
  • 1 kudos

You need to add the provider alias to the databricks_metastore resource, i.e.: resource "databricks_metastore" "this" { provider = databricks.azure_account name = var.metastore_name storage_root = format("abfss://%s@%s.dfs.core.windows.net/", azurerm...
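A complete version of that resource might look like the following; the variable names and the storage container/account references are assumptions filled in from the truncated snippet, not verified configuration.

```hcl
# Sketch: metastore created through the account-level provider alias,
# as suggested above. All names below are placeholders.
resource "databricks_metastore" "this" {
  provider = databricks.azure_account
  name     = var.metastore_name
  region   = var.region
  storage_root = format(
    "abfss://%s@%s.dfs.core.windows.net/",
    azurerm_storage_container.unity.name,
    azurerm_storage_account.unity.name,
  )
  force_destroy = true
}
```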

1 More Replies
jv_v
by Databricks Partner
  • 8499 Views
  • 3 replies
  • 0 kudos

Authenticating to Accounts Console Using Client ID and Secret via Terraform and Databricks CLI

I am currently working on a project where I need to authenticate to the Databricks accounts console from Terraform using a client ID and client secret. Here is the relevant portion of my Terraform configuration: // Provider for Databricks account provi...

jv_v_0-1718116490905.png
Latest Reply
JianWu
Databricks Employee
  • 0 kudos

First, run the following commands in a shell, replacing the placeholders according to your environment: export CLIENT_ID=<client id> export CLIENT_SECRET=<client secret> export TOKEN_EP=https://accounts.cloud.databricks.com/oidc/accounts/<databricks accoun...
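On the Terraform side, the same client ID/secret pair can be wired directly into an account-level provider block. This is a sketch; the alias and variable names are illustrative, and the host shown is the AWS accounts console from the reply above.

```hcl
# Hypothetical account-level provider using OAuth M2M (client ID + secret).
provider "databricks" {
  alias         = "account"
  host          = "https://accounts.cloud.databricks.com"
  account_id    = var.databricks_account_id
  client_id     = var.client_id     # service principal application ID
  client_secret = var.client_secret # OAuth secret from the accounts console
}
```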

2 More Replies
shahname
by New Contributor II
  • 2961 Views
  • 1 reply
  • 1 kudos

Resolved! Region-specific issue in Unity Catalog

Hello, I am using Unity Catalog on my Databricks workspace, and all of my data resides in a data lake located in the West Europe region. I have to onboard the Korea data to UC, which is present in a data lake pointing to the South Korea region. My question is: do I need to set...

Latest Reply
jacovangelder
Databricks MVP
  • 1 kudos

I think what you're asking is whether you need a new metastore for your Korea data. The technical answer is no. You can just onboard the Korean storage account as an external location in your West Europe-based metastore. However, you can't onboard Databric...

kaizad
by New Contributor
  • 1031 Views
  • 0 replies
  • 0 kudos

Owner can't sign into account after enabling SSO

Hi all, I recently enabled SSO on my Databricks account. Now, when a user signs in they see "No workspaces have been enabled for your account", which is the expected behavior as I haven't created any workspaces yet. However, when I try to sign in wit...

kaizad_0-1718712452338.png
OmkarMehta
by New Contributor
  • 6708 Views
  • 1 reply
  • 0 kudos

Delta Sharing

Can Delta Sharing have row- and column-level ACLs?

Latest Reply
JianWu
Databricks Employee
  • 0 kudos

You can filter row/column using dynamic view along with delta sharing. Here is the documentation: https://docs.databricks.com/en/data-sharing/create-share.html#add-dynamic-views-to-a-share-to-filter-rows-and-columns 

Swethag
by New Contributor II
  • 2819 Views
  • 1 reply
  • 1 kudos

Resolved! Best practices for setting up the user groups in Databricks

Is there any documentation on the best practices for setting up user groups in Azure Databricks?

Latest Reply
JianWu
Databricks Employee
  • 1 kudos

We recommend using identity federation for user group setup. You can refer to the following documentation for details: https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/best-practices

Data_Analytics1
by Contributor III
  • 35618 Views
  • 10 replies
  • 2 kudos

File not found error.

FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/FileStore/config.share' When I am trying to read the config.share file, it is throwing this error. I tried with the Spark path format as well, which is 'dbfs:/FileStore/config.share', but it also...

Latest Reply
jacovangelder
Databricks MVP
  • 2 kudos

On Unity Catalog Shared Access Mode clusters you need to use a UC Volume to read (config) files using vanilla Python (e.g. with open(), which many libs use). You can no longer read files from DBFS this way. This is all part of the new security m...

9 More Replies