Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Forum Posts

MadelynM
by Databricks Employee
  • 2101 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Data Governance | Navigate the explosion of AI, data and tools

Here's your Data + AI Summit 2024 - Data Governance recap as you navigate the explosion of AI, data and tools in efforts to build a flexible and scalable governance framework that spans your entire data and AI estate. Keynote: Evolving Data Governan...

Amber-26
by New Contributor III
  • 8008 Views
  • 5 replies
  • 2 kudos

Send System Table Logs to ADLS

Hello, I am working with Unity Catalog in Azure Databricks. I have enabled the system schemas for my workspace but am unable to figure out a way to send these system table logs to ADLS, which I have mounted using the Azure Databricks connector. Can someone ...

Latest Reply
karthik_p
Esteemed Contributor
  • 2 kudos

@Amber-26 you can try this approach. Also, if you want a graphical representation of everything, you can use the lakehouse monitoring feature: after enabling these tables, you can consume them into dashboards, run them, and analyze them.

4 More Replies
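A minimal sketch of the approach discussed in this thread: read an enabled system table and append it to a Delta path on ADLS. The table name, storage account, and container below are placeholders, and `spark` is assumed to be the session provided by a Databricks cluster with access to the `system` catalog.

```python
# Sketch: copy Unity Catalog system-table rows out to an ADLS location.
# The abfss URI and table name are placeholders for your own setup.

source_table = "system.access.audit"  # any system table enabled for the metastore
target_path = "abfss://logs@mystorageaccount.dfs.core.windows.net/system_audit"

def copy_system_table(spark, source_table, target_path):
    """Append the current contents of a system table to a Delta path on ADLS."""
    (spark.read.table(source_table)
          .write.format("delta")
          .mode("append")
          .save(target_path))

# On a cluster with access to the external location:
# copy_system_table(spark, source_table, target_path)
```

For recurring exports, the same function can be wrapped in a scheduled job that filters on an event-time column to avoid re-copying old rows.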
Yahya24
by New Contributor III
  • 8702 Views
  • 1 replies
  • 1 kudos

DBFS not accessible

Hello, I activated the "Table access control" option and changed the cluster access mode to Shared, with the aim of giving access rights to the tables without using Unity Catalog. Since this change I can't access DBFS files with Python:

Hubert-Dudek
by Esteemed Contributor III
  • 7386 Views
  • 0 replies
  • 1 kudos

Read only access to catalog

Databricks users can now easily set their data catalog to read-only mode. Just click "Manage Access Level" and choose "Change access to read only".

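The post above describes the UI route; the equivalent effect can be sketched in SQL as a set of read-only grants on the catalog. The catalog and group names here are placeholders, and this assumes a Unity Catalog-enabled workspace where privileges granted at catalog level are inherited by schemas and tables.

```python
# Sketch: read-only access to a catalog expressed as SQL grants.
# `analytics` and `reporting_users` are hypothetical names.
catalog = "analytics"
principal = "reporting_users"

read_only_grants = [
    f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`",
    f"GRANT USE SCHEMA ON CATALOG {catalog} TO `{principal}`",
    f"GRANT SELECT ON CATALOG {catalog} TO `{principal}`",
]

# On a Unity Catalog-enabled workspace, execute each statement:
# for stmt in read_only_grants:
#     spark.sql(stmt)
```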
isabelgontijo
by New Contributor II
  • 8108 Views
  • 1 replies
  • 0 kudos

View column comments

As a way to minimize storage costs, my team and I want to create views instead of tables in the Gold layer. We always try to improve the experience of our users by adding comments to the columns. The problem is that views do not inherit comments from ...

Latest Reply
luck_az
New Contributor III
  • 0 kudos

Hi @isabelgontijo, have you found any workaround for this? You can use a CREATE VIEW statement as mentioned below. In addition to this, I am not able to add a comment on a column which is encrypted, where the view decrypts that column. I am using CREATE VIEW...

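Since views do not inherit column comments from the underlying table, one workaround is to declare the comments directly in the view definition. A sketch, using placeholder schema and column names:

```python
# Sketch: column comments declared in the view itself, so the Gold-layer
# view carries its own documentation regardless of the source table.
create_view_sql = """
CREATE OR REPLACE VIEW gold.customer_summary (
  customer_id COMMENT 'Natural key from the source CRM system',
  lifetime_value COMMENT 'Sum of all order totals, in USD'
)
AS SELECT customer_id, lifetime_value
   FROM silver.customer_metrics
"""

# spark.sql(create_view_sql)
```

The downside is that comments must be maintained in two places if the source table is also commented.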
fuselessmatt
by Contributor
  • 5387 Views
  • 2 replies
  • 1 kudos

Resolved! Is it possible to manage access for legacy catalogs (hive_metastore) in Terraform?

We have been successfully managing access for our Unity catalogs using the databricks_grant resources in Terraform. Now we want to enable the Rudderstack integration for Databricks, but that does not support Unity Catalog and instead puts files inside...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Mattias P: Unfortunately, it is not currently possible to manage access to the Hive metastore catalog (or other external metastores) using the databricks_grant resource in Terraform. This is because the databricks_grant resource is specifically des...

1 More Replies
drii_cavalcanti
by New Contributor III
  • 2664 Views
  • 1 replies
  • 0 kudos

Execute Endpoint API with DLT

Hi, I was wondering if it would be possible to execute a DLT command such as @dlt.create_table against the execute endpoint API and then materialize the asset? Thank you, Adriana Cavalcanti

Latest Reply
drii_cavalcanti
New Contributor III
  • 0 kudos

Ref.: The Endpoint API that I am referring to is: https://docs.databricks.com/api/workspace/commandexecution/execute

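For context, the endpoint the poster links is the Command Execution API (`/api/1.2/commands/execute`), which runs a command inside an existing execution context on a cluster. A sketch of the request shape, with placeholder host, token, and IDs (nothing is actually sent here); note that DLT decorators like `@dlt.table` are only resolved when code runs inside a DLT pipeline, so executing them through this endpoint is unlikely to materialize anything.

```python
# Sketch: shape of a Command Execution API request. All values in angle
# brackets are placeholders for a real workspace, cluster, and context.
import json
import urllib.request

host = "https://<workspace>.cloud.databricks.com"
payload = {
    "clusterId": "<cluster-id>",
    "contextId": "<execution-context-id>",
    "language": "python",
    "command": "print('hello from the execution context')",
}
req = urllib.request.Request(
    url=f"{host}/api/1.2/commands/execute",
    data=json.dumps(payload).encode(),
    headers={"Authorization": "Bearer <token>",
             "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # requires a real workspace and token
```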
paniz_asghari
by New Contributor
  • 2776 Views
  • 1 replies
  • 0 kudos

SparkException: There is no Credential Scope.

Hi, I am new to Databricks and trying to connect to RStudio Server from my all-purpose compute cluster. Here is the cluster configuration: Policy: Personal Compute; Access mode: Single user; Databricks runtime version: 13.2 ML (includes Apache Spark 3.4.0...

Latest Reply
kunalmishra9
New Contributor III
  • 0 kudos

Running into this issue as well. Let me know if you found a resolution, @paniz_asghari!  

carlafernandez
by New Contributor III
  • 44296 Views
  • 10 replies
  • 6 kudos

Resolved! Databricks-connect version 13.0.0 throws Exception with details = "Missing required field 'UserContext' in the request."

I'm trying to connect to a cluster with Runtime 13.0 and Unity Catalog through databricks-connect version 13.0.0 (for Python). The Spark session seems to initialize correctly, but any time I try to use it, I get the following error: {SparkConnectGrpcExc...

Latest Reply
redperiabras
New Contributor II
  • 6 kudos

I have the same error up to DBR 13.3 LTS. When I upgraded it to 14.0, I was then able to connect with my databricks compute from my local environment.

9 More Replies
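For anyone hitting this, a minimal databricks-connect (v13+) session setup looks roughly like the sketch below, assuming a Unity Catalog-enabled cluster and the `databricks-connect` package installed locally. Host, token, and cluster ID are placeholders; the helper just assembles the Spark Connect remote string that `DatabricksSession.builder.remote()` accepts.

```python
# Sketch: building the Spark Connect remote string for databricks-connect.
# All three arguments below are placeholders for your own workspace values.
def build_remote_string(host, token, cluster_id):
    """Connection string accepted by DatabricksSession.builder.remote()."""
    return (f"sc://{host}:443/;token={token};"
            f"x-databricks-cluster-id={cluster_id}")

remote = build_remote_string(
    "adb-1234567890123456.7.azuredatabricks.net",
    "<personal-access-token>",
    "<cluster-id>",
)

# from databricks.connect import DatabricksSession
# spark = DatabricksSession.builder.remote(remote).getOrCreate()
```

As the replies note, keeping the local databricks-connect version in line with (or below) the cluster's runtime version matters; mismatches surfaced as the 'UserContext' error for several posters.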
drii_cavalcanti
by New Contributor III
  • 1792 Views
  • 0 replies
  • 0 kudos

Permission on Hive Metastore DBX 10.4

I've been working on creating a schema in the Hive metastore using the following command: spark.sql('CREATE DATABASE IF NOT EXISTS {database}') The schema or database is successfully created, but I encountered an issue where it's only accessible for me...

Labels: Data Governance, clusters, hive_metastore, legacy, permission
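On clusters with legacy table access control enabled, a schema created this way is visible only to its creator until privileges are granted. A sketch of the legacy grants that would open it up, with placeholder database and group names:

```python
# Sketch: legacy table-access-control grants on a Hive metastore database,
# so users other than the creator can see and query it.
# `my_database` and `data-engineers` are hypothetical names.
database = "my_database"
group = "data-engineers"

legacy_grants = [
    f"GRANT USAGE ON DATABASE {database} TO `{group}`",
    f"GRANT SELECT ON DATABASE {database} TO `{group}`",
    f"GRANT READ_METADATA ON DATABASE {database} TO `{group}`",
]

# On a cluster with table access control enabled:
# for stmt in legacy_grants:
#     spark.sql(stmt)
```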
drii_cavalcanti
by New Contributor III
  • 1313 Views
  • 0 replies
  • 0 kudos

Hive Metastore permission on DBX 10.4 Cluster

I've been working on creating a schema in the Hive metastore using the following command: spark.sql(f'CREATE DATABASE IF NOT EXISTS {database}') The schema or database is successfully created, but I encountered an issue where it's only accessible for m...

drii_cavalcanti
by New Contributor III
  • 1122 Views
  • 0 replies
  • 0 kudos

Hive Metastore Schema Permissions

I've been working on creating a schema in the Hive metastore using the following command: spark.sql(f'CREATE DATABASE IF NOT EXISTS {database}') The schema or database is successfully created, but I encountered an issue where it's only accessible for m...

AsphaltDataRide
by New Contributor III
  • 31461 Views
  • 10 replies
  • 10 kudos

job event trigger - Invalid credentials for storage location

I want to use an event trigger to start a job. The MI has the Storage Blob Data Contributor role; the test connection is successful at the level of the external location; I have read permission on the external location; I have owner permission on the job; o...

Latest Reply
adriennn
Valued Contributor
  • 10 kudos

For reference: https://stackoverflow.com/a/75906376/2842348. It seems this could be made to work by allowing connectivity from Databricks' private vnets, the same way it is currently done for serverless setups if you have an environment that blocks p...

9 More Replies
kaustubhgupta
by New Contributor II
  • 4230 Views
  • 1 replies
  • 0 kudos

How to setup oauth for Databricks connection in Tableau Server?

We extensively use Databricks (DBX) for creating Tableau visuals. Whenever DBX data sources are published on our self-hosted Tableau Server, we have to add the connection creds for the data source. These creds can be both personal email id/password o...

Latest Reply
kaustubhgupta
New Contributor II
  • 0 kudos

Thanks for your response. While I understood most of it, I still have some doubts to clear, so I'll explain it again. Problem: We want users to create data sources using Databricks via their personal credentials in the local env (Tableau Desktop...

TonyUK
by New Contributor II
  • 10323 Views
  • 6 replies
  • 7 kudos

Can I delete hive_metastore on a Unity Catalog Workspace?

Hi, I have a new workspace that was converted to be controlled via Unity Catalog. There is no data stored here, as this is not in use yet, so I wanted to ask whether it is safe to remove hive_metastore without breaking the workspace? Thank you

Latest Reply
JasonThomas
New Contributor III
  • 7 kudos

As things gravitate towards Unity Catalog, it would indeed be nice to have the ability to completely remove Hive from existing setups.

5 More Replies
shkelzeen
by New Contributor II
  • 4306 Views
  • 2 replies
  • 0 kudos

Create token for ServicePrincipal using java SDK

Hi Team, I am looking into implementing functionality that allows me to create a ServicePrincipal, and I want to create a personal token for this ServicePrincipal using the Java SDK. While trying to do this I am getting this error: ```com.databricks.sdk.c...

Latest Reply
shkelzeen
New Contributor II
  • 0 kudos

Hi, thank you for your response. How would Databricks know which service principal I want to grant the permission to?

1 More Replies
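For context on the follow-up question: on-behalf-of tokens are created through the Token Management API, where the service principal is identified by its application ID in the request body. A sketch of the raw REST payload (which the SDKs' token-management clients wrap); all IDs are placeholders, and nothing is sent here.

```python
# Sketch: request body for the on-behalf-of token endpoint. The
# application_id field is how Databricks identifies which service
# principal the token is created for. Values are placeholders.
obo_request = {
    "application_id": "<service-principal-application-id>",
    "comment": "token for CI pipeline",
    "lifetime_seconds": 3600,
}
endpoint = "/api/2.0/token-management/on-behalf-of/tokens"

# A workspace admin (or the authenticated principal with the right
# entitlements) POSTs `obo_request` to `endpoint` on the workspace host.
```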
