Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Forum Posts

GlenMacLarty
by New Contributor III
  • 4048 Views
  • 2 replies
  • 1 kudos

Resolved! Security Analysis Tool (SAT) on GCP - OSError: [Errno 5] Input/output error

I am interested to hear from anyone who has set up the Security Analysis Tool (SAT) on a GCP-hosted Databricks environment. I am in the process of getting the tool set up and I'm experiencing issues running the security_analysis_initializer notebook. Th...

Latest Reply
GlenMacLarty
New Contributor III
  • 1 kudos

Thanks @Kaniz_Fatma, I have been able to get past this error by recreating the cluster with an absolutely barebones config. It was potentially a custom configuration (unknown at this time) which was causing this to fail. I will try and reproduce once I...

1 More Replies
vmpmreistad
by New Contributor II
  • 3618 Views
  • 1 reply
  • 0 kudos

CREATE OR REPLACE VIEW removes permissions [Unity Catalog]

When I run CREATE OR REPLACE VIEW on an existing view in Unity Catalog, the grants that were made on that object are removed. This seems like a bug. Is it intentional or not? How to replicate: 1. Create the view. 2. Run the CREATE OR REPLACE statement:...

Data Governance
Unity Catalog
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @vmpmreistad, it appears that the issue you're facing is a known behaviour in Databricks when you execute a CREATE OR REPLACE VIEW statement on an existing view. This action overwrites the existing view definition, including any previously granted...

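As a sketch of a workaround for the behaviour described in this thread (names here are illustrative, not from the post): capture the grants on the view with SHOW GRANTS before replacing it, then re-issue them afterwards. The helper below only builds the GRANT statements as strings, so the re-grant logic can be checked independently of a running Spark session; in practice each statement would be executed with spark.sql(...).

```python
def regrant_statements(grants, view_name):
    """Build GRANT statements that restore privileges after a
    CREATE OR REPLACE VIEW has wiped them.

    grants: list of (principal, privilege) tuples, e.g. as captured
            from `SHOW GRANTS ON VIEW <view>` before the replace.
    """
    return [
        f"GRANT {privilege} ON VIEW {view_name} TO `{principal}`"
        for principal, privilege in grants
    ]

# Hypothetical privileges captured before the replace:
captured = [("analysts", "SELECT"), ("etl_service", "SELECT")]
statements = regrant_statements(captured, "main.gold.my_view")
# After the CREATE OR REPLACE VIEW has run, each statement would be
# executed with spark.sql(statement).
```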
Amber-26
by New Contributor III
  • 6294 Views
  • 5 replies
  • 2 kudos

Send System Table Logs to ADLS

Hello, I am working with Unity Catalog in Azure Databricks. I have enabled the system schemas for my workspace but am unable to figure out a way to send these system table logs to ADLS, which I have mounted using the Azure Databricks connector. Can someone ...

Latest Reply
karthik_p
Esteemed Contributor
  • 2 kudos

@Amber-26 you can try this approach. Also, if you want a graphical representation of everything, you can use the lakehouse monitoring feature: after enabling these tables, you can consume them into dashboards and analyze them.

4 More Replies
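One possible shape for the export asked about above (all names are hypothetical, not from the thread): read a system table and write it to an abfss:// path on ADLS. The helper below only builds the destination path; the actual copy would be a Spark read/write, noted in the comment.

```python
def adls_export_path(container, storage_account, system_table):
    """Build an abfss:// destination path for exporting a system table.

    The catalog.schema.table name (e.g. 'system.access.audit') is turned
    into a directory suffix so each table lands in its own folder.
    """
    suffix = system_table.replace(".", "/")
    return f"abfss://{container}@{storage_account}.dfs.core.windows.net/{suffix}"

dest = adls_export_path("logs", "mystorageacct", "system.access.audit")
# With a running Spark session, the export itself could then look like:
#   spark.table("system.access.audit").write.mode("append").format("delta").save(dest)
# (run on a schedule so new audit rows keep landing in ADLS).
```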
MattG1
by New Contributor II
  • 5403 Views
  • 4 replies
  • 2 kudos

Running Stored Procedures on a Multi-node/ Shared Cluster

Hi, we are trying to move some of our code from a 'legacy' cluster to a 'Multi-node/Shared' cluster so that we can start using Unity Catalog. However, we have run into an issue with some of our code, which calls stored procedures, on the new cluster....

Latest Reply
TjommeV-Vlaio
New Contributor II
  • 2 kudos

We face the same issue. Our code depends heavily on stored procedures (best practice or not). Unfortunately, when moving to shared clusters and DBR 13.3, this does not work anymore: driver_manager = spark._sc._gateway.jvm.java.sql.DriverManager connection...

3 More Replies
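The snippet quoted in the reply above reaches into the py4j gateway (spark._sc._gateway), which is exactly what shared access mode blocks. One commonly used alternative is to open a plain JDBC/ODBC connection from Python (e.g. via pyodbc or jaydebeapi, both assumptions here, not confirmed by the thread) and invoke the procedure through the standard JDBC call-escape syntax. The helper below only builds that call string, so it can be checked without a database.

```python
def call_escape(proc_name, param_count):
    """Build a JDBC call-escape string for a stored procedure,
    e.g. '{call dbo.my_proc(?, ?)}' for two parameters."""
    placeholders = ", ".join(["?"] * param_count)
    return f"{{call {proc_name}({placeholders})}}"

# Hypothetical usage with a direct connection (driver name is an assumption):
#   import pyodbc
#   conn = pyodbc.connect(connection_string)
#   conn.execute(call_escape("dbo.update_stats", 2), (arg1, arg2))
```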
Yahya24
by New Contributor III
  • 4027 Views
  • 2 replies
  • 0 kudos

DBFS not accessible

Hello, I activated the "Table access control" option and changed the cluster access mode to Shared, with the aim of giving access rights to the tables without using Unity Catalog. Since this change I can't access DBFS files with Python:

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Yahya24, when you change the cluster access mode to "Shared" on Databricks, the cluster is associated with a Databricks-managed IAM role that is used to access AWS resources. This role might not have the necessary permissions to access DBFS res...

1 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 3539 Views
  • 0 replies
  • 1 kudos

Read only access to catalog

Databricks users can now easily set their data catalog to read-only mode. Just click "Manage Access Level" and choose "Change access to read only".

isabelgontijo
by New Contributor
  • 6381 Views
  • 1 reply
  • 0 kudos

View column comments

As a way to minimize storage costs, my team and I want to create views instead of tables in the Gold layer. We always try to improve the experience of our users by adding comments to the columns. The problem is that views do not inherit comments from ...

Latest Reply
luck_az
New Contributor III
  • 0 kudos

Hi @isabelgontijo, have you found any workaround for this? You can use the CREATE VIEW statement as mentioned below. In addition to this, I am not able to add a comment on a column which is encrypted and which we decrypt in the view. I am using CREATE VIEW...

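Since views do not inherit column comments from base tables, the comments can be declared explicitly in the view's column list. As a sketch (table and comment names are made up for illustration), the helper below builds such a CREATE VIEW statement as a string, which could then be run with spark.sql(...).

```python
def create_view_with_comments(view_name, select_sql, column_comments):
    """Build a CREATE OR REPLACE VIEW statement that declares column
    comments explicitly in the view's column list.

    column_comments: list of (column, comment) tuples in SELECT order.
    """
    cols = ",\n  ".join(
        f"{col} COMMENT '{comment}'" for col, comment in column_comments
    )
    return f"CREATE OR REPLACE VIEW {view_name} (\n  {cols}\n) AS {select_sql}"

# Hypothetical Gold-layer view over a Silver table:
stmt = create_view_with_comments(
    "main.gold.v_customers",
    "SELECT customer_id, name FROM main.silver.customers",
    [("customer_id", "Natural key"), ("name", "Customer full name")],
)
```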
ajbush
by New Contributor III
  • 7418 Views
  • 1 reply
  • 0 kudos

Unity Catalog permissions approach for strict controls when creating tables

Hi all, I'm deploying Unity Catalog into a large enterprise with quite strict controls. I want to give my users some autonomy over creating tables from upstream datasets they have SELECT access on, but also restrict these controls. I've been through a...

Latest Reply
Priyag1
Honored Contributor II
  • 0 kudos

A schema contains tables, views, and functions. You create schemas inside catalogs. Requirements: you must have the USE CATALOG and CREATE SCHEMA data permissions on the schema's parent catalog. Either a metastore admin or the owner of the catalog can ...

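For the use case in the question above (autonomy to create tables in one schema, read-only access elsewhere), a minimal grant set would combine USE CATALOG, USE SCHEMA, CREATE TABLE, and SELECT. The sketch below (principal and object names are hypothetical) builds those GRANT statements; each would be run with spark.sql(...) by a catalog owner or metastore admin.

```python
def minimal_create_grants(principal, catalog, schema):
    """GRANT statements letting a group create tables in one schema
    while still needing SELECT to read upstream data there."""
    target = f"{catalog}.{schema}"
    return [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`",
        f"GRANT USE SCHEMA ON SCHEMA {target} TO `{principal}`",
        f"GRANT CREATE TABLE ON SCHEMA {target} TO `{principal}`",
        f"GRANT SELECT ON SCHEMA {target} TO `{principal}`",
    ]

grants = minimal_create_grants("data_scientists", "main", "sandbox")
```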
FreshPork
by New Contributor II
  • 5833 Views
  • 2 replies
  • 2 kudos

Resolved! Data Lineage in Unity Catalog on GCP

Hi, is it possible to track and display data lineage with Unity Catalog while using Databricks on GCP? Like so: https://docs.databricks.com/en/data-governance/unity-catalog/data-lineage.html If it's not yet implemented, then is there any roadmap avai...

Data Governance
Data Lineage
GCP
Unity Catalog
Latest Reply
FreshPork
New Contributor II
  • 2 kudos

Thank you, That's great news! 

1 More Replies
TienDat
by New Contributor III
  • 10566 Views
  • 3 replies
  • 2 kudos

Resolved! Unity Catalog: Expose REST API for Table Insights

Dear community, our company is using Databricks and we are happy to have Unity Catalog emerge to solve part of our Data Governance problems. We are very interested in the newly introduced Table Insights feature (view frequent queries and use...

Latest Reply
TienDat
New Contributor III
  • 2 kudos

Dear @Kaniz_Fatma, thanks for your answer. I just checked, and indeed such information can be queried from the information_schema tables. Anyway, do you have any information on whether there is a plan to expose such insights via a REST API, or not at all? The reas...

2 More Replies
fuselessmatt
by Contributor
  • 3956 Views
  • 2 replies
  • 1 kudos

Resolved! Is it possible to manage access for legacy catalogs (hive_metastore) in Terraform?

We have been successfully managing access for our Unity Catalog objects using the databricks_grant resources in Terraform. Now we want to enable the Rudderstack integration for Databricks, but that does not support Unity Catalog and instead puts files inside...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Mattias P: Unfortunately, it is not currently possible to manage access to the Hive metastore catalog (or other external metastores) using the databricks_grant resource in Terraform. This is because the databricks_grant resource is specifically des...

1 More Replies
drii_cavalcanti
by New Contributor III
  • 2221 Views
  • 1 reply
  • 0 kudos

Execute Endpoint API with DLT

Hi, I was wondering if it would be possible to execute a DLT command such as @dlt.create_table against the Execute endpoint API and then materialize the asset? Thank you, Adriana Cavalcanti

Latest Reply
drii_cavalcanti
New Contributor III
  • 0 kudos

Ref.: the endpoint API that I am referring to is https://docs.databricks.com/api/workspace/commandexecution/execute

paniz_asghari
by New Contributor
  • 2211 Views
  • 1 reply
  • 0 kudos

SparkException: There is no Credential Scope.

Hi, I am new to Databricks and trying to connect to RStudio Server from my all-purpose compute cluster. Here is the cluster configuration: Policy: Personal Compute; Access mode: Single user; Databricks runtime version: 13.2 ML (includes Apache Spark 3.4.0...

Latest Reply
kunalmishra9
New Contributor III
  • 0 kudos

Running into this issue as well. Let me know if you found a resolution, @paniz_asghari!  

carlafernandez
by New Contributor III
  • 30819 Views
  • 10 replies
  • 6 kudos

Resolved! Databricks-connect version 13.0.0 throws Exception with details = "Missing required field 'UserContext' in the request."

I'm trying to connect to a cluster with Runtime 13.0 and Unity Catalog through databricks-connect version 13.0.0 (for Python). The Spark session seems to initialize correctly, but any time I try to use it, I get the following error: {SparkConnectGrpcExc...

Latest Reply
redperiabras
New Contributor II
  • 6 kudos

I have the same error up to DBR 13.3 LTS. When I upgraded to 14.0, I was then able to connect to my Databricks compute from my local environment.

9 More Replies