Data Governance

Forum Posts

Dp15
by Contributor
  • 4615 Views
  • 3 replies
  • 3 kudos

Resolved! Refresh an External table's metadata

Hi, I have an external table which is created out of an S3 bucket. The first time I am creating the table I am using the following command: query = """CREATE TABLE IF NOT EXISTS catalog.schema.external_table_s3           USING PARQUET            LOCAT...
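A minimal sketch of that pattern plus a metadata refresh, assuming a hypothetical S3 path (the path in the post is truncated):

# Hedged sketch (hypothetical S3 path): create the external Parquet table, then
# refresh its cached metadata after files under the location change.
spark.sql("""
    CREATE TABLE IF NOT EXISTS catalog.schema.external_table_s3
    USING PARQUET
    LOCATION 's3://example-bucket/path/to/data/'
""")

# REFRESH TABLE re-reads the file listing so new or removed files are picked up;
# if the Parquet schema itself changes, recreating the table is typically needed.
spark.sql("REFRESH TABLE catalog.schema.external_table_s3")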

Latest Reply
Dp15
Contributor
  • 3 kudos

Hi @Kaniz, thank you for the reply. How can we handle schema changes in the external location? If there are additions to or deletions from the schema, will the refresh table work then too?

2 More Replies
Mado
by Valued Contributor II
  • 26845 Views
  • 7 replies
  • 1 kudos

Resolved! Databricks Audit Logs: where can I find table usage information or queries?

Hi, I want to access the Databricks Audit Logs to check the table usage information. I created a Databricks workspace on the premium pricing tier and enabled it for the Unity Catalogue. I configured Audit logs to be sent to Azure Diagnostic log delivery...
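Since the workspace is Unity Catalog-enabled, one hedged option (not the thread's accepted answer) is to query the audit system table directly. The schema, action names, and request_params key below follow the documented audit log layout but should be verified against current docs:

# Hedged sketch: table-access events from the audit system table, assuming the
# "access" system schema is enabled for the metastore.
usage = spark.sql("""
    SELECT event_time,
           user_identity.email             AS user_email,
           action_name,
           request_params['full_name_arg'] AS table_full_name
    FROM system.access.audit
    WHERE service_name = 'unityCatalog'
      AND action_name IN ('getTable', 'createTable', 'deleteTable')
      AND event_date >= date_sub(current_date(), 7)
""")
display(usage)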

Latest Reply
Mado
Valued Contributor II
  • 1 kudos

Thanks @Suteja Kanuri. Could you guide me on how to set up and configure Table Access Control (TAC)?
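In a Unity Catalog workspace, the equivalent of table access control is GRANT-based privileges; a minimal sketch with hypothetical catalog, schema, table, and group names (not from the thread):

# Hedged sketch (hypothetical names): reading a table requires USE CATALOG,
# USE SCHEMA, and SELECT for the connecting principal.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.sales TO `data_analysts`")
spark.sql("GRANT SELECT ON TABLE main.sales.orders TO `data_analysts`")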

6 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 3026 Views
  • 1 reply
  • 1 kudos

Enable lineage data system tables

Execute the Python code below in your Databricks workspace to enable the lineage system tables: import requests ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext() api_url = ctx.tags().get("browserHostName").get() api_token = ctx.apiTo...
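A hedged completion of that pattern, using the Unity Catalog system-schemas REST endpoint (the lineage tables live in the access system schema; verify the API paths against current docs before relying on them):

# Hedged sketch: enable the "access" system schema (lineage tables) for the
# metastore assigned to this workspace.
import requests

ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
host = ctx.tags().get("browserHostName").get()
token = ctx.apiToken().get()
headers = {"Authorization": f"Bearer {token}"}

# Look up the metastore attached to the current workspace.
metastore_id = requests.get(
    f"https://{host}/api/2.1/unity-catalog/metastore_summary", headers=headers
).json()["metastore_id"]

# Enable the system schema that contains the lineage tables.
resp = requests.put(
    f"https://{host}/api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas/access",
    headers=headers,
)
print(resp.status_code, resp.text)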

Latest Reply
iplantevin
New Contributor III
  • 1 kudos

Great, I wondered why I didn't see them

RamaTeja
by New Contributor II
  • 7909 Views
  • 6 replies
  • 0 kudos

In Azure Databricks Delta Lake, not able to see Unity Catalog databases or tables in the drop-down.

I have created an Azure Data Factory pipeline with a copy data function to copy data from an ADLS path to a Delta table. In the Delta table drop-downs I am able to see only the hive metastore databases and tables, but the Unity Catalog tables are not...

Latest Reply
latteuro
New Contributor II
  • 0 kudos

Hi, I have the same issue. Additional information: the linked service created in Azure Data Factory using the Azure Databricks Delta Lake connector is using a system-assigned managed identity rather than a token. Could we have an update? Thank you in advance.

5 More Replies
karthik_p
by Esteemed Contributor
  • 1908 Views
  • 1 reply
  • 1 kudos

Managed Locations overlap issue (DLT + UC)

Has anyone tried UC DLT? We are trying to create multiple catalogs for UC DLT. Note: as per the limitations it won't support locations, only managed locations. Our use case: we want to create 2 catalogs, one for dev and the other for production. Dev catalog ...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @karthik_p, it's important to note that DLT pipelines can read from anywhere but cannot create DLT tables in any place outside of the catalog/schema specified in the pipeline settings. If you're trying to write to both locations in the same pipel...
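A hedged illustration of that constraint (hypothetical catalog and table names): reads may cross catalogs, but the created table always lands in the catalog and target schema configured on the pipeline, so dev and prod need separate pipelines or parameterized settings.

# Hedged DLT sketch (hypothetical names): the read can reference any accessible
# catalog, but the output table goes to the pipeline's configured catalog/schema.
import dlt
from pyspark.sql import functions as F

@dlt.table(name="orders_clean", comment="Cleansed orders")
def orders_clean():
    return (
        spark.read.table("dev_catalog.raw.orders")   # read can cross catalogs
             .where(F.col("status").isNotNull())
    )
# The table is written only to <pipeline catalog>.<pipeline target schema>.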

-werners-
by Esteemed Contributor III
  • 9154 Views
  • 8 replies
  • 2 kudos

Resolved! Multiple storage credentials/external locations to the same physical location

Hi all, we are in the process of rolling out a new Unity-enabled Databricks env with 2 tiers: dev and prod. Initially we had the plan to completely decouple dev and prod, each with their own data lake as storage. While this is the safest option, it does...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

That is the way I am working right now: assign the workspace to the catalog and set it to read-only if necessary. It would be easier, though, if it were possible to define a 2nd external location as read-only, as this cannot break anything (of course in rea...
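Since a second external location on the same path is not an option, a hedged sketch of the single-location workaround with read-only grants for dev (hypothetical location, credential, and group names):

# Hedged sketch (hypothetical names): one external location over the shared path,
# writable for prod and read-only for dev.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS lake_raw
    URL 'abfss://raw@mydatalake.dfs.core.windows.net/'
    WITH (STORAGE CREDENTIAL lake_credential)
""")
spark.sql("GRANT READ FILES, WRITE FILES ON EXTERNAL LOCATION lake_raw TO `prod_engineers`")
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION lake_raw TO `dev_engineers`")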

7 More Replies
TienDat
by New Contributor III
  • 7409 Views
  • 5 replies
  • 0 kudos

Filter sensitive data on nested column

Dear all, we have been working on the column masking topic recently, using the column filter feature of Unity Catalog. We recently faced the problem of masking a nested column (a sub-column within a STRUCT type column). We just wonder if this is even possible ...

Latest Reply
DucNguyen
New Contributor II
  • 0 kudos

Same as my concern. I tried to mask a DECIMAL datatype but it doesn't work either. The Databricks examples for column masks may work well with a simple datatype like STRING. Somehow it doesn't meet our requirements for data governance.
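The thread leaves direct masking of nested fields open; a hedged alternative (not confirmed by the thread) is a dynamic view that redacts just the sensitive sub-field, with hypothetical table, column, and group names:

# Hedged sketch (hypothetical names): rebuild the STRUCT with the sensitive
# field redacted for non-privileged users, leaving other fields untouched.
spark.sql("""
    CREATE OR REPLACE VIEW main.gov.customers_masked AS
    SELECT
      id,
      CASE
        WHEN is_account_group_member('pii_readers') THEN contact
        ELSE named_struct('email', '***', 'phone', contact.phone)
      END AS contact
    FROM main.gov.customers
""")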

4 More Replies
Kavi_007
by New Contributor III
  • 6865 Views
  • 5 replies
  • 1 kudos

Resolved! Unable to open the account console

I'm the account admin for the subscription where the Databricks workspace is created. However, if I open the account console, it's prompting me to select one of the Databricks workspaces. Please note that it's my own subscription via Visual Studio benef...

Latest Reply
Kavi_007
New Contributor III
  • 1 kudos

If there is an existing metastore for the region, doesn't that mean my organization stores all the metadata across the workspaces in a specific metastore or data lake? Databricks suggests having one metastore per region. How the project-specific metadat...

4 More Replies
Pritesh2
by New Contributor II
  • 1669 Views
  • 0 replies
  • 0 kudos

How Can Companies Organize Their Data Through Data Governance?

When managing and safeguarding your internal data, data governance is crucial. It works like insurance, ensuring that all the data you gather is appropriately disseminated and kept safe within your company. Let's discuss data governance and how to imp...

Data Governance
data engineering
Kris2
by New Contributor II
  • 2625 Views
  • 1 reply
  • 0 kudos

Unable to create a Managed table in Unity Catalog default location

We have set up the metastore with a Managed Identity, and when trying to create a managed table in the default location I am hitting the below error. The storage is ADLS Gen2. AbfsRestOperationException: Operation failed: "This request is not authorized to pe...

Latest Reply
RebeccaVC
New Contributor II
  • 0 kudos

Have you followed all the steps in here to create the metastore? https://app.getreprise.com/launch/wy1Y2ly/ Do you have all the necessary permissions granted to create a Managed Table? https://docs.databricks.com/en/data-governance/unity-catalog/manag...

Amber-26
by New Contributor III
  • 4694 Views
  • 4 replies
  • 0 kudos

System Tables not showing up in PowerBI

Hello, I am currently working with System Tables in the Unity Catalogue. I have loaded all the schemas in the catalogue and I am using PowerBI to directly access these tables. But while connecting PowerBI to Databricks, I am not able to see System Ta...

Latest Reply
Chinu
New Contributor III
  • 0 kudos

I had/have the same issue with Tableau Desktop. I'm not able to select the "Billing" schema because I don't see the "System" catalog. However, I found a workaround to resolve this issue. In the Databricks console, go to the "Catalog Explorer" and sele...
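Visibility in BI tools also depends on the grants held by the connecting principal; a hedged sketch of the grants that typically expose the billing schema (hypothetical group name):

# Hedged sketch (hypothetical group): grant access on the system catalog so BI
# clients can list and query the billing tables.
spark.sql("GRANT USE CATALOG ON CATALOG system TO `bi_users`")
spark.sql("GRANT USE SCHEMA ON SCHEMA system.billing TO `bi_users`")
spark.sql("GRANT SELECT ON TABLE system.billing.usage TO `bi_users`")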

3 More Replies
Gilg
by Contributor II
  • 1792 Views
  • 3 replies
  • 1 kudos

Unity Catalog and Instance Pool

Hi Team, are Instance Pools supported in a Unity Catalog-enabled workspace? Cheers, G

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Gilg, in an Azure Databricks workspace enabled for Unity Catalog, you can leverage the power of Unity Catalog to manage data access and identity federation. Here are the key points: Unity Catalog Enablement: When you enable a workspace for Un...

2 More Replies
AdamMcGuinness
by New Contributor III
  • 18321 Views
  • 4 replies
  • 1 kudos

Metastore - One per Account/Region Limitation

Looking at Databricks' suggested use of catalogs, my instincts are now leading me to the conclusion that having a separate metastore for each SDLC environment (dev, test, prod) is preferable. I think if this pattern were followed, this means due to current ...

Latest Reply
SSundaram
Contributor
  • 1 kudos

You can create multiple metastores per region within an account. This is not a hard constraint; reach out to your account team and they can make an exception. Before doing that, consider what kind of securable sharing you will need between dev, test ...

3 More Replies
marc88
by New Contributor II
  • 2619 Views
  • 2 replies
  • 0 kudos

Only read access to Databricks Delta tables

Hi, is it possible to make Delta Shares work for a Matlab client (just the way we share Delta tables to a PowerBI workbench)? We have Unity Catalog enabled and I wanted to explore more about data governance with the aforesaid integration. Wanted to discuss ...
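Delta Sharing is an open protocol, so any client with a connector can consume a share read-only. The Python connector below is only an illustration (a Matlab client would need its own connector or the REST protocol directly), and the profile file and table names are hypothetical:

# Hedged sketch (hypothetical names): read a shared table with the open
# delta-sharing Python connector.
import delta_sharing

profile = "/path/to/recipient_config.share"   # credential file issued to the recipient
table_url = f"{profile}#my_share.my_schema.my_table"

df = delta_sharing.load_as_pandas(table_url)  # shares are read-only by design
print(df.head())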

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @marc88, Certainly! Let's explore the integration of Delta Sharing and Unity Catalog with a focus on sharing data with a Matlab client. Here are some insights: Unity Catalog and Data Governance: Unity Catalog is a powerful solution for data g...

1 More Replies
stiaangerber
by New Contributor III
  • 3952 Views
  • 4 replies
  • 2 kudos

MLlib load from UC Volume: IllegalArgumentException: Cannot access the UC Volume path...

I'm trying to store MLlib instances in Unity Catalog Volumes. I think volumes are a great way to keep things organized. I can save to a volume without any issues and I can access the data using spark.read and with plain Python open(). However, when I ...

Latest Reply
slimexy
New Contributor II
  • 2 kudos

Just to supplement that if the ML model is saved and then loaded within the same execution, calling load() will not cause the mentioned exception. Copying the model directory from UC volume to ephemeral storage attached to the driver node is also a w...
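A hedged sketch of the copy-then-load workaround mentioned above, with hypothetical volume and model paths:

# Hedged sketch (hypothetical paths): copy the saved model directory out of the
# UC volume, then load it from the copy.
from pyspark.ml import PipelineModel

volume_path = "/Volumes/main/ml/models/my_pipeline_model"
local_path = "file:/tmp/my_pipeline_model"

# dbutils.fs.cp understands UC volume paths; copy the whole directory.
dbutils.fs.cp(volume_path, local_path, recurse=True)

# Loading from driver-local storage follows the reply's workaround; on a
# multi-node cluster a shared path (e.g. DBFS) may be needed instead.
model = PipelineModel.load(local_path)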

3 More Replies