Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Forum Posts

MadelynM
by Databricks Employee
  • 3338 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Data Governance | Navigate the explosion of AI, data and tools

Here's your Data + AI Summit 2024 - Data Governance recap as you navigate the explosion of AI, data and tools in efforts to build a flexible and scalable governance framework that spans your entire data and AI estate. Keynote: Evolving Data Governan...

Bob_Rid
by New Contributor II
  • 1258 Views
  • 3 replies
  • 2 kudos

Resolved! IPAuthorization error

Is there a feature update request or resolution for adding service endpoints to the worker-vn that databricks created? We are experiencing 403 errors for IP authorizations from the worker-vn and do not have the permissions to update the service endpo...

Latest Reply
hari-prasad
Valued Contributor II
  • 2 kudos

Great to know! 

2 More Replies
animadurkar
by New Contributor III
  • 17579 Views
  • 7 replies
  • 3 kudos

Resolved! Unable to query Unity Catalog tables from notebooks.

In my workspace, I'm able to see the Unity catalogs my team has created. I'm able to see the schemas and even query data in there and create views using the SQL Editor. When I go to write the same SQL queries from a notebook using spark.sql or %sql I ...

Latest Reply
KIRKQUINBAR
New Contributor III
  • 3 kudos

I know this is an old topic, but I have the same issue querying data using the SQL editor and a SQL warehouse. My SQL warehouse does have Unity Catalog enabled. Not sure why it wouldn't be working.
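
For anyone landing here with the same symptom, a minimal sketch of querying a Unity Catalog table from a notebook with the full three-level namespace, assuming the attached compute is Unity Catalog enabled; the catalog, schema, and table names below are placeholders.

from pyspark.sql import SparkSession

# In a Databricks notebook `spark` already exists; getOrCreate() just reuses it.
spark = SparkSession.builder.getOrCreate()

# Use the three-level namespace so the query resolves against Unity Catalog
# rather than the legacy hive_metastore.
df = spark.sql("SELECT * FROM main.sales.orders LIMIT 10")
df.show()

# Equivalent DataFrame-API form.
orders = spark.read.table("main.sales.orders")
print(orders.count())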

6 More Replies
KristiLogos
by Contributor
  • 4037 Views
  • 1 reply
  • 0 kudos

Salesforce to Databricks connection

I've been following this documentation to get the Salesforce Data Cloud connection set up in Databricks: https://learn.microsoft.com/en-us/azure/databricks/query-federation/salesforce-data-cloud. I've added the client ID and secret ID and the scope, and it s...

Latest Reply
Ayushi_Suthar
Databricks Employee
  • 0 kudos

Hi @KristiLogos , Good Day!  We understand that you are facing the following error while you are trying to create a connection with Salesforce, but since it's been a long time, we wanted to check if you are still facing the issue or if it's resolved ...
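
For anyone following the same page, a rough sketch of the connection and foreign-catalog steps it describes, run from a notebook. The connection name, catalog name, and OPTIONS keys (client_id, client_secret, instance_url) are assumptions based on the Lakehouse Federation docs, so verify them against the linked article before use.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Connection object for Salesforce Data Cloud; option keys are assumptions,
# check the linked documentation for the exact names your workspace expects.
spark.sql("""
CREATE CONNECTION IF NOT EXISTS sfdc_data_cloud_conn
TYPE salesforce_data_cloud
OPTIONS (
  client_id     '<connected-app-client-id>',
  client_secret '<connected-app-client-secret>',
  instance_url  'https://<your-instance>.salesforce.com'
)
""")

# Surface the connection as a foreign catalog so its objects can be queried.
# Your source may require extra OPTIONS here (for example a dataspace name).
spark.sql("""
CREATE FOREIGN CATALOG IF NOT EXISTS salesforce_dc
USING CONNECTION sfdc_data_cloud_conn
""")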

wpcreative
by New Contributor
  • 5470 Views
  • 3 replies
  • 0 kudos

Optimizing Your Site’s Branded Search - 5 Simple Ways Trusted by Every SEO Agency Sydney

Optimizing Your Site’s Branded Search - 5 Simple Ways Trusted by Every SEO Agency Sydney. What is branded searching? Simply told, brand searches occur when individuals type your WP Creative in search engines like Google, Yahoo, etc. Since customers are d...

Latest Reply
Devidrich
New Contributor II
  • 0 kudos

HikeMyTraffic® provided excellent insights on Optimizing Your Site’s Branded Search. Their approach, along with strategies from SeoProfy and Livepage, made it easier to boost brand visibility. Highly recommend their expertise in enhancing branded sea...

2 More Replies
212455
by New Contributor
  • 4482 Views
  • 3 replies
  • 0 kudos

Resolved! List Service Principal OBO Tokens

I am trying to list OBO tokens that have been created for service principals. I tried using the Token Management API 2.0 (https://docs.databricks.com/dev-tools/api/latest/token-management.html#operation/get-tokens) to list workspace tokens, but it on...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Nick Tran, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ...
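
Since the thread closed without a posted fix, here is a hedged sketch of listing workspace tokens via the Token Management API mentioned in the question. The created_by_id query parameter is an assumption taken from that API's reference, and the host, admin token, and service principal ID are placeholders.

import requests

HOST = "https://<workspace-url>"               # placeholder workspace URL
ADMIN_TOKEN = "<admin-personal-access-token>"  # placeholder admin credential
SP_ID = "<service-principal-numeric-id>"       # placeholder internal ID of the SP

# List workspace tokens; filtering by creator is an assumed capability of the endpoint.
resp = requests.get(
    f"{HOST}/api/2.0/token-management/tokens",
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
    params={"created_by_id": SP_ID},
)
resp.raise_for_status()

for token in resp.json().get("token_infos", []):
    print(token.get("token_id"), token.get("comment"), token.get("expiry_time"))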

2 More Replies
youssefmrini
by Databricks Employee
  • 3090 Views
  • 1 reply
  • 1 kudos

What's new in workspace Catalog binding

Watch the YouTube video: https://www.youtube.com/watch?v=S9LLpMvAcT4

Latest Reply
jhonm_839
New Contributor III
  • 1 kudos

This is a great video! It's really helpful to see how to control access to catalogs in Unity Catalog. I especially like the part about granting read-only access to production data. This is a great way to allow data analysts to do their work without g...

sandeephenkel23
by New Contributor III
  • 6752 Views
  • 2 replies
  • 0 kudos

SELECT SCHEMA / USE_SCHEMA: which is most suitable for access?

Hi, as part of the Data Governance / Authorization topic, we are working on automating the code for granting access at the CATALOG, SCHEMA, and TABLE levels in Unity Catalog. As USE CATALOG provides access at the catalog level to a user/group (whic...

Data Governance
Select Schema and Use_Schema
Latest Reply
MoJaMa
Databricks Employee
  • 0 kudos

USE is foundational. You need it at the catalog level and for any schema you want to access. So let's say I need SELECT only on 1 table, T1, in catalog C1, schema S1 (though there may be 1K tables in that schema). Then I would need USE CATALOG on C1; USE...
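
To make the chain concrete, a minimal sketch of the grants implied above, run from a notebook; catalog C1, schema S1, table T1 and the `analysts` group are placeholders carried over from the example.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# USE CATALOG / USE SCHEMA only let the principal traverse to the object;
# the single SELECT grant is what actually exposes data in T1.
spark.sql("GRANT USE CATALOG ON CATALOG C1 TO `analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA C1.S1 TO `analysts`")
spark.sql("GRANT SELECT ON TABLE C1.S1.T1 TO `analysts`")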

1 More Replies
giohappy
by New Contributor III
  • 5636 Views
  • 3 replies
  • 1 kudos

Can we assume the path to the managed tables in the hive_metastore is reliable?

Managed tables are stored under /user/hive/warehouse, which is also mentioned in the documentation. In our workflow, we use that path to read the parquet files from outside (through the Databricks connector). Can we assume this path is reliable, or i...

Latest Reply
MoJaMa
Databricks Employee
  • 1 kudos

That path is reliable but we would recommend not using that path in general. That's your workspace root storage. Your data should be in a cloud path of your choosing (s3/adls/gcs) so that you can separate your data out by BU/Project/team etc based on...
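
A small sketch of the recommended pattern: resolve the managed table by name through the metastore instead of hard-coding the workspace-root path; the schema and table names are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Preferred: let the metastore resolve the storage location for you.
df = spark.read.table("hive_metastore.my_schema.my_table")
df.show()

# Discouraged: relying on the physical layout of the workspace root storage.
# df = spark.read.parquet("/user/hive/warehouse/my_schema.db/my_table")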

2 More Replies
prasad_vaze
by New Contributor III
  • 6225 Views
  • 1 reply
  • 0 kudos

Is there a way to import table and column descriptions into Unity Catalog?

I have a spreadsheet containing table & column descriptions (comments). Is there a way to upload these against the schema in Unity Catalog? Basically, instead of running the 'ALTER TABLE <> ALTER COLUMN <> COMMENT "description"' command for every colum...

Data Governance
Unity Catalog
Latest Reply
sourav69201
Databricks Employee
  • 0 kudos

One of the ways to do it is to create a Delta table from the spreadsheet and loop through it. Then, use the field values to dynamically build an ALTER query that updates each description.
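
A minimal sketch of that approach, assuming the spreadsheet has already been loaded into a Delta table whose columns are table_name, column_name, and description (those names are assumptions for illustration).

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Delta table built from the spreadsheet; table and column names are assumptions.
rows = spark.table("main.governance.column_descriptions").collect()

for row in rows:
    # Escape single quotes so the description is a valid SQL string literal.
    comment = row["description"].replace("'", "\\'")
    spark.sql(
        f"ALTER TABLE {row['table_name']} "
        f"ALTER COLUMN {row['column_name']} COMMENT '{comment}'"
    )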

Dp15
by Contributor
  • 13289 Views
  • 2 replies
  • 1 kudos

Refresh an external table's metadata

Hi, I have an external table which is created from an S3 bucket. The first time I am creating the table I am using the following command: query = """CREATE TABLE IF NOT EXISTS catalog.schema.external_table_s3 USING PARQUET LOCAT...

Latest Reply
cgrant
Databricks Employee
  • 1 kudos

Please try partition discovery for external tables. This feature should make it so that you can successfully run the MSCK REPAIR command, and more importantly, query external Parquet tables in a more performant way.
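
A short sketch of the pattern being suggested, with placeholder names and location; the dt partition column is an assumption for illustration, and the exact repair syntax may differ if Unity Catalog partition metadata is enabled.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# External Parquet table declared with its partition column over an S3 prefix.
spark.sql("""
CREATE TABLE IF NOT EXISTS catalog.schema.external_table_s3 (id BIGINT, dt DATE)
USING PARQUET
PARTITIONED BY (dt)
LOCATION 's3://my-bucket/path/'
""")

# Register partitions that were added to the bucket after table creation.
spark.sql("MSCK REPAIR TABLE catalog.schema.external_table_s3")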

1 More Replies
RohithChippa
by New Contributor III
  • 1410 Views
  • 2 replies
  • 0 kudos

Issue Adding Foreign Catalog Tables to Data Clean Room

Hi, I tried testing by adding a Snowflake table through a Databricks foreign catalog into a Data Clean Room. I was able to establish the connection and query the data in Databricks successfully. However, I was unable to see the catalog when trying to add dat...

Data Governance
datacleanroom
Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

@RohithChippa Please check if you have the requirements to use clean rooms. To be eligible to use clean rooms, you must: have an account that is enabled for serverless compute (see Enable serverless compute) and have a workspace that is enabled for Unity...

1 More Replies
Abishrp
by Contributor
  • 2248 Views
  • 9 replies
  • 4 kudos

Resolved! Issue in system.compute.node_timeline table

For some of my clusters, I am not able to find data in the system.compute.node_timeline table to view utilization details, even after 36 hrs since the run.

Latest Reply
Walter_C
Databricks Employee
  • 4 kudos

Yes, it seems the runs are around 1 minute of execution, so this might be the reason why the metrics are not loaded.
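
For reference, a small sketch of checking the system table directly; the cluster ID is a placeholder and the selected columns are based on the documented node_timeline schema, so adjust if yours differs.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# node_timeline holds roughly one row per node per minute, so very short runs
# may have few or no samples.
df = spark.sql("""
SELECT start_time, end_time, cpu_user_percent, mem_used_percent
FROM system.compute.node_timeline
WHERE cluster_id = '<your-cluster-id>'
ORDER BY start_time DESC
LIMIT 20
""")
df.show(truncate=False)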

8 More Replies
GeoPer
by New Contributor III
  • 1110 Views
  • 3 replies
  • 0 kudos

Shared Access mode cluster FAILS to write data to BigQuery

We are trying to migrate our old infra to Unity Catalog. We have some pipelines which write to BigQuery tables. To enable Unity Catalog at the cluster level we have 2 options (Single user and Shared). Unfortunately, we tried using a Shared (Access mode) cluster ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

If you do a SHOW GRANTS on this cluster, what does it show? If you have SELECT on ANY FILE, then you probably just need to grant MODIFY on ANY FILE, as described here: https://docs.databricks.com/en/data-governance/table-acls/any-file.html
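
A minimal sketch of the check and the grant being described, using the legacy table-ACL syntax from the linked page; the `data_engineers` group is a placeholder.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Inspect what the principal currently holds on ANY FILE.
spark.sql("SHOW GRANTS `data_engineers` ON ANY FILE").show(truncate=False)

# Add write access, as suggested in the reply above.
spark.sql("GRANT MODIFY ON ANY FILE TO `data_engineers`")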

2 More Replies
F_Goudarzi
by New Contributor III
  • 3920 Views
  • 2 replies
  • 0 kudos

Naming conventions for delta sharing

Hi All, a question for those using Delta Sharing: how are you defining naming conventions for Share and Recipient names? What best practices or standards are you following? Thanks, Fatima

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

For platform-to-platform environments, I nest all shared tables under a specially named Schema with a prefix of DS-.  For example, my schema name could be ds-mfgschema. This is a 'flag' that any tables nested under this schema are being delta shared ...
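
To illustrate the convention, a small sketch of creating a schema with the ds- prefix and adding one of its tables to a share; the catalog, share, and table names are placeholders, and the hyphenated schema name simply mirrors the example above.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# The schema name itself flags that everything under it is exposed via Delta Sharing.
spark.sql("CREATE SCHEMA IF NOT EXISTS main.`ds-mfgschema`")

# Create the share and add a table that lives in the flagged schema.
spark.sql("CREATE SHARE IF NOT EXISTS mfg_share")
spark.sql("ALTER SHARE mfg_share ADD TABLE main.`ds-mfgschema`.production_orders")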

1 More Replies
santos_saenz
by New Contributor II
  • 1006 Views
  • 2 replies
  • 1 kudos

Search Capabilities in Unity Catalog

I am exploring the search capabilities available in Unity Catalog. Specifically, I want to know if it is possible for a user to discover the existence of a table and its owner via search, even if they don't have access to the table's data. This featu...

Latest Reply
santos_saenz
New Contributor II
  • 1 kudos

I just found out that with the BROWSE privilege on the catalog this can be done
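
For anyone looking for the exact statement, a one-line sketch of that grant; the catalog and group names are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# BROWSE lets members discover object metadata (names, comments, owners)
# in Catalog Explorer and search without granting access to the data itself.
spark.sql("GRANT BROWSE ON CATALOG main TO `data_discovery_users`")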

1 More Replies
