Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.
Forum Posts

jv_v
by New Contributor III
  • 1736 Views
  • 2 replies
  • 0 kudos

Resolved! Understanding the Use of a Specific Terraform Block in Unity Catalog Automation

I am currently working on automating Unity Catalog (UC) using Terraform, and I came across the following Terraform block:  resource "databricks_metastore_data_access" "first" {  provider = databricks.Workspace  metastore_id = databricks_metastore.thi...

Latest Reply
jv_v
New Contributor III
  • 0 kudos

I implemented the following Terraform code for configuring Databricks metastore data access: terraform { required_providers { azurerm = { source = "hashicorp/azurerm" } databricks = { source = "databricks/databricks" } } } provider "azurerm" { alias = "azure" sk...
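For readers following along, a fuller (hypothetical) version of the pattern discussed in this thread might look like the sketch below; the resource names, provider alias, and access connector reference are placeholder assumptions, not the poster's actual configuration. `databricks_metastore_data_access` attaches a storage credential to a metastore so Unity Catalog can reach the metastore's root storage.

```hcl
# Illustrative sketch only; all names are placeholders.
resource "databricks_metastore_data_access" "first" {
  provider     = databricks.workspace
  metastore_id = databricks_metastore.this.id
  name         = "metastore-root-credential"

  # On Azure, the credential is typically an access connector's managed identity.
  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.this.id
  }

  # Mark this credential as the default for the metastore's root storage.
  is_default = true
}
```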

1 More Replies
ArjunGopinath96
by New Contributor
  • 658 Views
  • 1 reply
  • 0 kudos

Change Data Feed Cost

Greetings, I want to understand the efficiency of using Change Data Feed to track changes in a table that has around 1 million rows. There will be around 20K appends per week. I read that to track appends CDF is not the right way; if that's tru...

Latest Reply
Ravivarma
New Contributor III
  • 0 kudos

Hello @ArjunGopinath96 , Greetings! Change Data Feed (CDF) in Delta Lake provides an efficient way to track changes in a table, including appends. It works by recording row-level changes between versions of a Delta table, capturing both the row data ...
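As a sketch of what the reply describes, enabling and reading CDF takes two statements; the catalog, schema, table name, and starting version below are placeholders:

```sql
-- Enable the change data feed on an existing Delta table:
ALTER TABLE my_catalog.my_schema.events
SET TBLPROPERTIES (delta.enableChangeDataFeed = true);

-- Read only the rows that changed since a given table version;
-- appended rows arrive with _change_type = 'insert':
SELECT * FROM table_changes('my_catalog.my_schema.events', 5);
```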

jv_v
by New Contributor III
  • 1425 Views
  • 2 replies
  • 1 kudos

Resolved! Issue Creating Metastore Using Terraform with Service Principal Authentication

I'm encountering an issue when attempting to create a metastore using Terraform with service principal authentication. Below is the error message I receive: Error: "module.metastore_and_users.databricks_metastore.this: error: cannot create metastore: d...

Latest Reply
jacovangelder
Contributor III
  • 1 kudos

You need to add the provider alias to the databricks_metastore resource, i.e.: resource "databricks_metastore" "this" { provider = databricks.azure_account name = var.metastore_name storage_root = format("abfss://%s@%s.dfs.core.windows.net/", azurerm...
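A more complete, illustrative version of the fix might look like the following; the provider alias, host, and variable names here are assumptions, not the thread's actual configuration. The key point is that the account-level provider must be aliased and referenced explicitly, otherwise Terraform authenticates with the default (workspace-level) provider:

```hcl
# Account-level provider with an explicit alias (names are placeholders).
provider "databricks" {
  alias      = "azure_account"
  host       = "https://accounts.azuredatabricks.net"
  account_id = var.databricks_account_id
}

resource "databricks_metastore" "this" {
  # Without this line, the default workspace provider is used and the call fails.
  provider     = databricks.azure_account
  name         = var.metastore_name
  storage_root = format("abfss://%s@%s.dfs.core.windows.net/",
    azurerm_storage_container.unity_catalog.name,
    azurerm_storage_account.unity_catalog.name)
  region       = var.region
}
```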

1 More Replies
kaizad
by New Contributor
  • 237 Views
  • 1 reply
  • 0 kudos

Owner can't sign into account after enabling SSO

Hi all, I recently enabled SSO on my Databricks account. Now, when a user signs in they see "No workspaces have been enabled for your account", which is the expected behavior as I haven't created any workspaces yet. However, when I try to sign in wit...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @kaizad, thank you for sharing your concern with us! To expedite your request, please raise it on our ticketing portal; our support staff will be able to act on the resolution faster (our standard resolution time is 24-48 hours).

jv_v
by New Contributor III
  • 1028 Views
  • 3 replies
  • 0 kudos

Authenticating to Accounts Console Using Client ID and Secret via Terraform and Databricks CLI

I am currently working on a project where I need to authenticate to the Databricks accounts console from Terraform using a client ID and client secret. Here is the relevant portion of my Terraform configuration: // Provider for Databricks account provi...

Latest Reply
jw-dbx
New Contributor III
  • 0 kudos

First, run the following commands in a shell, replacing each placeholder according to your environment: export CLIENT_ID=<client id> export CLIENT_SECRET=<client secret> export TOKEN_EP=https://accounts.cloud.databricks.com/oidc/accounts/<databricks accoun...
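A hedged sketch of the token flow the reply starts to describe: the account ID and client credentials below are placeholders you must supply, and the actual request is left commented out so nothing reaches the network until you fill them in. The OAuth M2M flow exchanges the service principal's ID and secret for a token via the account's `/oidc/.../v1/token` endpoint.

```shell
# Placeholders -- replace with values from your own Databricks account.
CLIENT_ID="your-service-principal-client-id"
CLIENT_SECRET="your-service-principal-secret"
ACCOUNT_ID="your-databricks-account-id"
TOKEN_EP="https://accounts.cloud.databricks.com/oidc/accounts/${ACCOUNT_ID}/v1/token"

# The actual token request (uncomment to run against your account):
# curl -s -X POST "$TOKEN_EP" \
#   -u "${CLIENT_ID}:${CLIENT_SECRET}" \
#   -d 'grant_type=client_credentials&scope=all-apis'

echo "$TOKEN_EP"
```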

2 More Replies
shahname
by New Contributor II
  • 1266 Views
  • 1 reply
  • 1 kudos

Resolved! region specific issue in Unity catalog

Hello, I am using Unity Catalog on my Databricks workspace, and all of my data resides in a data lake located in West Europe. I have to onboard Korea data to UC, which sits in a data lake in the South Korea region. My question is: do I need to set...

Latest Reply
jacovangelder
Contributor III
  • 1 kudos

I think what you're asking is whether you need a new metastore for your Korea data. The technical answer is no: you can onboard the Korean storage account as an external location in your West Europe-based metastore. However you can't onboard Databric...
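An illustrative sketch of onboarding the Korean storage account as an external location; the credential, location, container, and group names are all placeholder assumptions, and a storage credential with access to the Korean storage account is assumed to already exist:

```sql
-- Register the Korea-region storage account in the existing metastore:
CREATE EXTERNAL LOCATION IF NOT EXISTS korea_data
URL 'abfss://data@koreastorageaccount.dfs.core.windows.net/'
WITH (STORAGE CREDENTIAL korea_storage_credential);

-- Grant access to the teams that need it:
GRANT READ FILES ON EXTERNAL LOCATION korea_data TO `data_engineers`;
```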

RicksDB
by Contributor II
  • 745 Views
  • 1 reply
  • 1 kudos

Semantic-link on Databricks

Hello, is there any plan to support the equivalent of Fabric semantic-link on Databricks? Essentially, having the ability to query a Power BI dataset using SQL on interactive/jobs clusters and SQL warehouses (i.e. a Unity Catalog federated source being Powe...

Latest Reply
cassiebratt
New Contributor II
  • 1 kudos

@RicksDB wrote: Hello, is there any plan to support the equivalent of Fabric semantic-link on Databricks? Essentially, having the ability to query a Power BI dataset using SQL on interactive/jobs clusters and SQL warehouses (i.e. a Unity Catalo...

Christine
by Contributor II
  • 47116 Views
  • 17 replies
  • 16 kudos

Resolved! Cannot use RDD and cannot set "spark.databricks.pyspark.enablePy4JSecurity false" for cluster

I have been using "rdd.flatMap(lambda x:x)" for a while to create lists from columns; however, after I changed the cluster to Shared access mode (to use Unity Catalog) I get the following error: py4j.security.Py4JSecurityException: Method public ...

Latest Reply
rahuja
New Contributor III
  • 16 kudos

In my case the problem was that we were trying to use SparkXGBoostRegressor and in the docs it says that it does not work on clusters with autoscaling enabled. So we just disabled autoscaling for the interactive cluster where we were testing the mode...

16 More Replies
OmkarMehta
by New Contributor
  • 4293 Views
  • 1 reply
  • 0 kudos

Delta Sharing

Can Delta Sharing have row- and column-level ACLs? 

Latest Reply
jw-dbx
New Contributor III
  • 0 kudos

You can filter row/column using dynamic view along with delta sharing. Here is the documentation: https://docs.databricks.com/en/data-sharing/create-share.html#add-dynamic-views-to-a-share-to-filter-rows-and-columns 
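A minimal sketch of the dynamic-view pattern from the linked documentation; the catalog, schema, table, share, and recipient property names here are placeholders. `current_recipient()` returns a property set on the share recipient, so each recipient sees only its own rows:

```sql
-- Each recipient only sees rows matching its 'country' recipient property:
CREATE VIEW main.shared.customers_filtered AS
SELECT id, name, country
FROM main.default.customers
WHERE country = current_recipient('country');

-- Add the view (rather than the base table) to the share:
ALTER SHARE my_share ADD VIEW main.shared.customers_filtered;
```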

Swethag
by New Contributor II
  • 798 Views
  • 1 reply
  • 1 kudos

Resolved! Best practices for setting up the user groups in Databricks

Is there any documentation on best practices for setting up user groups in Azure Databricks?

Latest Reply
jw-dbx
New Contributor III
  • 1 kudos

We recommend using identity federation for user group setup. You can refer to the following documentation for details: https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/best-practices 

Data_Analytics1
by Contributor III
  • 22989 Views
  • 14 replies
  • 3 kudos

File not found error.

FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/FileStore/config.share' When I try to read the config.share file, it throws this error. I tried the Spark path format as well, 'dbfs:/FileStore/config.share', but it also...

Latest Reply
jacovangelder
Contributor III
  • 3 kudos

On Unity Catalog Shared Access Mode clusters you need to use a UC Volume to read (config) files with vanilla Python (e.g. the open() call that many libraries use). You can no longer read files from DBFS this way. This is all part of the new security m...
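A minimal Python sketch of the pattern the reply describes; the Volume path shown in the comment is a placeholder following the /Volumes/&lt;catalog&gt;/&lt;schema&gt;/&lt;volume&gt;/ convention, not a real path:

```python
def read_config(path: str) -> str:
    """Read a config file with plain Python I/O, as you would from a UC Volume."""
    with open(path, "r", encoding="utf-8") as f:
        return f.read()

# On a UC Shared Access Mode cluster you would call, for example:
# profile = read_config("/Volumes/main/default/configs/config.share")
```

The point is simply that the path must live under /Volumes/... rather than /dbfs/... for vanilla file I/O to work on these clusters.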

13 More Replies
AntonioR
by New Contributor II
  • 833 Views
  • 0 replies
  • 0 kudos

Initializing Session Variables for use in row filtering/column masking in Every Session

I just got through reading the great technical blog post from @SergeRielau about session variables: https://community.databricks.com/t5/technical-blog/sql-session-variables-stash-your-state-and-use-it-too/bc-p/72453#M193 Super useful facility and well ...

eduardo_marin_n
by New Contributor
  • 331 Views
  • 1 reply
  • 0 kudos

Cannot find SOC compliance report

Hi, I am trying to obtain SOC compliance documents for my company. Despite following the instructions of the AI assistant, I can't find anywhere to submit a support ticket to get these documents. Has anyone had experience with this issue in the past w...

Latest Reply
jw-dbx
New Contributor III
  • 0 kudos

You can contact your Databricks account executive (sales representative); they should be able to get you a copy of the SOC compliance reports. 

