Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Forum Posts

MadelynM
by Databricks Employee
  • 4059 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Data Governance | Navigate the explosion of AI, data and tools

Here's your Data + AI Summit 2024 - Data Governance recap as you navigate the explosion of AI, data and tools in efforts to build a flexible and scalable governance framework that spans your entire data and AI estate. Keynote: Evolving Data Governan...

isabelgontijo
by New Contributor II
  • 9730 Views
  • 2 replies
  • 0 kudos

View column comments

As a way to minimize storage costs, my team and I want to create views instead of tables in the Gold layer. We always try to improve the experience of our users by adding comments to the columns. The problem is that views do not inherit comments from ...

Latest Reply
DavidThomasDBX
Databricks Employee
  • 0 kudos

You can use COMMENT ON to set or update the comment on a view:
-- View-level comment (use TABLE for views)
COMMENT ON TABLE catalog.schema.my_view IS 'View for user info';
-- Column-level comment on a view
COMMENT ON COLUMN catalog.schema.my_view.name IS...

1 More Replies
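Building on the reply above, here is a minimal sketch of generating those COMMENT ON statements for several view columns at once. The table, view, and column names are hypothetical; actually applying the statements requires a Spark session with Unity Catalog access (e.g. spark.sql(stmt)).

```python
# Sketch: build COMMENT ON COLUMN statements to copy column comments onto a
# view. All object names below are hypothetical placeholders.

def comment_on_column(view_fqn: str, column: str, comment: str) -> str:
    """Build a COMMENT ON COLUMN statement for one view column."""
    escaped = comment.replace("'", "''")  # escape single quotes for SQL literals
    return f"COMMENT ON COLUMN {view_fqn}.{column} IS '{escaped}'"

def copy_comments(view_fqn: str, column_comments: dict) -> list:
    """Build one statement per commented column."""
    return [comment_on_column(view_fqn, col, c) for col, c in column_comments.items()]

stmts = copy_comments(
    "catalog.schema.my_view",
    {"name": "Customer name", "id": "Primary key"},
)
# Each statement would then be executed with spark.sql(stmt).
```

Generating the statements from the base table's column metadata keeps the view comments in sync without retyping them by hand.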
ShaSap
by New Contributor II
  • 5773 Views
  • 6 replies
  • 1 kudos

Connect Purview to AWS Databricks Unity Catalog?

Hello!  Newbie here, so apologies if this is a super basic question.  Trying to figure out if we can connect Purview to an AWS instance of Databricks (vice Azure instance), but I have only seen articles on connecting Azure Databricks to Purview.  I r...

Latest Reply
malinadiego123
New Contributor III
  • 1 kudos

@ShaSap wrote:Hello!  Newbie here, so apologies if this is a super basic question.  Trying to figure out if we can connect Purview to an AWS instance of Databricks (vice Azure instance), but I have only seen articles on connecting Azure Databricks to...

5 More Replies
Fikrat
by Databricks Partner
  • 1173 Views
  • 3 replies
  • 2 kudos

Resolved! Encryption for UC managed tables on AWS based databricks

Hi folks, how is encryption handled for UC managed tables on an AWS-based Databricks account? I found this post: https://docs.databricks.com/aws/en/security/keys/, which describes how the workspace file system (DBFS) and control plane data are encrypted, bu...

Latest Reply
Fikrat
Databricks Partner
  • 2 kudos

Thanks for clarifications, @szymon_dybczak!

2 More Replies
tana_sakakimiya
by Contributor
  • 721 Views
  • 1 replies
  • 1 kudos

Resolved! Any Hint to view artifact?

I stored my model artifact in a volume. I can view the object via the catalog page. However, when I open it from Experiment > Artifact, I can't view the artifact; I get a "failed to fetch" error. Any idea how to solve the problem?

Latest Reply
Advika
Community Manager
  • 1 kudos

Hello @tana_sakakimiya! Artifacts stored in Volumes can’t be viewed directly in the MLflow experiment UI, as it only supports displaying artifacts saved to DBFS. You can instead access or download artifacts through Catalog Explorer or using mlflow.ar...
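A small sketch of that workaround: artifacts under /Volumes/... aren't rendered by the experiment UI, so fetch them programmatically instead. The helper below only classifies the path (the URI is a hypothetical example); the actual download via mlflow.artifacts.download_artifacts is left as a commented call, since it needs a live workspace.

```python
# Sketch: detect Unity Catalog volume paths and route them to a programmatic
# download instead of the experiment UI. The path below is hypothetical.

def is_uc_volume_path(artifact_uri: str) -> bool:
    """True if the artifact lives in a Unity Catalog volume."""
    return artifact_uri.startswith(("/Volumes/", "dbfs:/Volumes/"))

uri = "/Volumes/main/default/models/model.pkl"  # hypothetical path
if is_uc_volume_path(uri):
    # import mlflow
    # local_path = mlflow.artifacts.download_artifacts(artifact_uri=uri)
    pass
```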

APJESK
by Contributor
  • 1884 Views
  • 2 replies
  • 4 kudos

Clarification on Unity Catalog Metastore - Metadata and storage

Where does the Unity Catalog metastore metadata actually reside? Is it stored and managed in the Databricks account (control plane)? Or does it get stored in the customer-managed S3 bucket we create for the Unity Catalog metastore? I want to ...

Latest Reply
nayan_wylde
Esteemed Contributor II
  • 4 kudos

1. Unity Catalog metadata (schemas, tables, permissions, lineage, etc.) is stored and managed in the Databricks control plane, not in your S3 bucket. This metadata resides in a Databricks-managed database and is not in your customer-managed storage.
2...

1 More Replies
boitumelodikoko
by Databricks Partner
  • 975 Views
  • 1 replies
  • 0 kudos

Permission Request Error – MODIFY Not Assignable in Unity Catalog

Hi everyone, I'm running into an issue when requesting permissions on a Unity Catalog table. Specifically, when I try to request SELECT and MODIFY privileges on a specific table using Request for Access in Unity Catalog, I get the following error: th...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @boitumelodikoko, weird, it looks like a bug. MODIFY is a valid privilege for a table object. You can check it yourself in the docs: Unity Catalog privileges and securable objects | Databricks on AWS. MODIFY allows a user to add, update, and delete data to ...

RicardoCauduro
by New Contributor III
  • 4923 Views
  • 5 replies
  • 8 kudos

Resolved! Unity catalog not visible

I'm currently working with Databricks within Azure and encountering an issue with Unity Catalog in a newly created workspace. My colleague initially deployed a Databricks workspace where Unity Catalog was provisioned and enabled by default. I recently...

Latest Reply
Khaja_Zaffer
Esteemed Contributor
  • 8 kudos

Hello, as szymon suggested, try checking with the admin. Also, you mentioned a VNet-injected workspace: was that done recently? If yes, then you might be missing the two private endpoints for the storage account; you needed to create 2 private endpoints, both ...

4 More Replies
ajai_duraisamy
by New Contributor
  • 599 Views
  • 1 replies
  • 1 kudos

Unity catalogue sync problem

Hi everyone, while creating a SYNC table in UC, I got an error and the creation failed. I am then not able to recreate the same table. When I checked the catalog, the table name is there, but I am not able to open it, and when I query the same table in a notebook it's givi...

Latest Reply
nayan_wylde
Esteemed Contributor II
  • 1 kudos

@ajai_duraisamy Here are a few checks you can do:
1. SHOW TABLES IN <catalog>.<schema> LIKE '<table>'; this command will show whether the table exists in UC.
2. Make sure your compute is UC-enabled. If you're on a cluster/SQL Warehouse without UC enabled (or c...
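The first check can be scripted so it's easy to rerun per environment. A minimal sketch, with placeholder catalog/schema/table names; the generated SQL would be executed with spark.sql(...) on UC-enabled compute.

```python
# Sketch: build the existence-check SQL from the checklist above.
# "main", "bronze", and "orders" are hypothetical placeholders.

def table_exists_sql(catalog: str, schema: str, table: str) -> str:
    """SQL that lists the table if it exists in Unity Catalog."""
    return f"SHOW TABLES IN {catalog}.{schema} LIKE '{table}'"

sql = table_exists_sql("main", "bronze", "orders")
# spark.sql(sql).count() > 0 would indicate the table is registered in UC.
```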

CarlosAlberto
by New Contributor
  • 1513 Views
  • 1 replies
  • 1 kudos

Resolved! Cannot set spark.plugins com.nvidia.spark.SQLPlugin config

I'm trying to use the Spark-NVIDIA integrations in order to train Spark ML models using GPUs. While trying to follow these instructions: https://docs.nvidia.com/spark-rapids/user-guide/23.12.1/getting-started/databricks.html, I could not execute the init...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

Since cluster initialization happens before the "libraries" section installs Maven artifacts, the plugin isn't available at the required time, causing the error.
Workaround strategies:
1. Internal Artifactory or DBFS manual upload: upload the RAPIDS j...

tana_sakakimiya
by Contributor
  • 1092 Views
  • 1 replies
  • 0 kudos

Resolved! Does AWS Databricks comply with Japanese FISC standards

Can anyone provide a reference showing that AWS Databricks complies with the FISC (The Center for Financial Industry Information Systems) standard, a standard for the financial industry in Japan? I can't find any official information yet. Or, if you find i...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @tana_sakakimiya, as you said, there's no official Databricks documentation regarding this. But when we consider the FAQ below from the AWS FISC docs page, we can see that from an audit perspective they will focus mainly on SOC 1, SOC 2, and SOC 3. And those a...

kktim
by New Contributor II
  • 4059 Views
  • 2 replies
  • 1 kudos

Resolved! Accessing unity catalog volumes from a databricks web application

Hello, I am trying to deploy a Gradio app (app.py) in Databricks, but I am having a problem accessing data stored in a Unity Catalog volume. It seems that I cannot access the data using a path like "/Volumes/catalog/schema.../my_data", which works...

Latest Reply
MMRDUS
New Contributor II
  • 1 kudos

Unlike notebooks, Databricks Apps does not support mounting Unity Catalog volumes and directly reading and writing files. As this code snippet demonstrates, each file needs to be downloaded to the app compute before being able to manipulate it.

1 More Replies
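The download-first pattern from the reply can be sketched as below. The client is injected so it can be any object exposing files.download(path); in a real app it would likely be databricks.sdk.WorkspaceClient() (an assumption about the setup, and the volume path shown is hypothetical). The file is staged to app-local disk before the app reads it.

```python
# Sketch: stage a UC volume file onto the app's local disk before use.
# `client` is any object with a files.download(path) method returning a
# response whose .contents supports .read() (as the Databricks SDK's Files
# API is understood to do -- an assumption, not verified here).

def stage_volume_file(client, volume_path: str, local_path: str) -> str:
    """Download a UC volume file to app-local disk and return the local path."""
    resp = client.files.download(volume_path)
    with open(local_path, "wb") as out:
        out.write(resp.contents.read())
    return local_path
```

Injecting the client also makes the helper easy to unit-test with a stub, without a live workspace.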
Charansai
by New Contributor III
  • 3666 Views
  • 1 replies
  • 1 kudos

Resolved! Best Governance Practice for Providing Access to Production Catalogs in Lower Environments (UC)

Hi everyone,I'm a Cloud Engineer working on a multi-environment Databricks setup (Dev, QA, Prod), and I've received a request from our Data Engineering team that's a bit unconventional — they are asking for access to Production Unity Catalogs from lo...

Latest Reply
Sai_Ponugoti
Databricks Employee
  • 1 kudos

Hi @Charansai ,That's a great question! In general, granting Dev/QA users direct access to Production catalogs is not considered best practice. The main risks you already mentioned (governance, compliance, and accidental writes) usually outweigh the ...

Akash30307
by New Contributor II
  • 1676 Views
  • 4 replies
  • 0 kudos

Can we deploy databricks notebooks using databricks asset bundles ( DABs )?

I went through the resources supported by DABs but cannot find notebook deployment via DABs. Is there any possible way we can deploy notebooks using DABs? What is the alternative approach?

Latest Reply
Sai_Ponugoti
Databricks Employee
  • 0 kudos

Hi @Akash30307, thank you for your question! You can also deploy notebooks via our Terraform provider; you can find more information on our website. You can also work with the databricks_notebook and databricks_notebook_paths data sources.

3 More Replies
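For the DABs side of the question: files under the bundle root, including notebooks, are synced to the workspace on `databricks bundle deploy`, and a job task can reference a notebook by bundle-relative path. A minimal databricks.yml sketch, with hypothetical bundle, job, and notebook names:

```yaml
# Hypothetical minimal databricks.yml; names and paths are placeholders.
bundle:
  name: demo_bundle

resources:
  jobs:
    demo_job:
      name: demo_job
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: ./src/my_notebook.py
```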
ReyCMFG
by New Contributor II
  • 54653 Views
  • 11 replies
  • 10 kudos

Can you use Managed Identities in databricks besides Unity Catalog

We are looking to use Databricks to send messages to an Azure Service Bus topic, and would like to connect to the Service Bus using a managed identity vs a connection string. Is this possible in Databricks? The only thing I could find regarding datab...

Latest Reply
yenneprem
New Contributor II
  • 10 kudos

Is there any update on using a managed identity instead of a Databricks-managed identity?

10 More Replies
ck7007
by Contributor II
  • 2163 Views
  • 4 replies
  • 3 kudos

Resolved! Achieved 87% Query Performance Improvement with Custom Zonemap Indexing

Problem: Queries on our 100M+ record Iceberg tables were taking 45+ seconds.
Solution: Implemented lightweight zonemap indexing that tracks min/max values per file.
Quick implementation:
def apply_zonemap_pruning(table_path, predicate_value):
    # Load zonem...

Latest Reply
Isi
Honored Contributor III
  • 3 kudos

Hi @ck7007 @WiliamRosa, I have a question: why are you actually doing this? I'm not fully familiar with your exact setup (Iceberg), but my understanding is that Iceberg already stores these stats (min/max) in the manifests, and Spark should be able...

3 More Replies
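For readers unfamiliar with the technique debated above, the min/max (zonemap) pruning idea can be sketched in a few lines, assuming a precomputed index of per-file stats (the file names and structure below are illustrative). As the reply notes, Iceberg's manifests already carry equivalent per-file statistics.

```python
# Sketch: prune data files whose [min, max] range cannot contain the
# predicate value. The zonemap entries below are hypothetical examples.

def prune_files(zonemap: list, predicate_value: float) -> list:
    """Keep only files whose [min, max] range could contain the value."""
    return [
        entry["file"]
        for entry in zonemap
        if entry["min"] <= predicate_value <= entry["max"]
    ]

zm = [
    {"file": "part-0.parquet", "min": 0, "max": 10},
    {"file": "part-1.parquet", "min": 11, "max": 20},
]
files = prune_files(zm, 15.0)  # only part-1.parquet survives
```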