Data Governance

Forum Posts

isabelgontijo
by New Contributor
  • 5204 Views
  • 1 reply
  • 0 kudos

View column comments

As a way to minimize storage costs, my team and I want to create views instead of tables in the Gold layer. We always try to improve the experience of our users by adding comments to the columns. The problem is that views do not inherit comments from ...

Latest Reply
luck_az
New Contributor III
  • 0 kudos

Hi @isabelgontijo, had you found any workaround for this? You can use a CREATE VIEW statement as mentioned below. In addition to this, I am not able to add a comment on a column which is encrypted and which we are decrypting in the view. I am using CREATE VIEW...
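The CREATE VIEW approach the reply alludes to can be sketched as follows. Column comments are not inherited from the base table, but they can be declared in the view's column list; the catalog, schema, table, and comment text below are placeholders:

```sql
-- Declare column comments explicitly in the view definition,
-- since views do not inherit them from the underlying table.
CREATE OR REPLACE VIEW gold.sales_summary (
  order_id     COMMENT 'Unique identifier of the order',
  total_amount COMMENT 'Order total, VAT included'
)
AS
SELECT order_id, total_amount
FROM silver.orders;
```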

ajbush
by New Contributor III
  • 2805 Views
  • 1 reply
  • 0 kudos

Unity Catalog permissions approach for strict controls when creating tables

Hi all, I'm deploying Unity Catalog into a large enterprise with quite strict controls. I want to give my users some autonomy over creating tables from upstream datasets they have SELECT access on, but also restrict these controls. I've been through a...

Latest Reply
Priyag1
Honored Contributor II
  • 0 kudos

A schema contains tables, views, and functions. You create schemas inside catalogs. Requirements: you must have the USE CATALOG and CREATE SCHEMA permissions on the schema's parent catalog. Either a metastore admin or the owner of the catalog can ...
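The privileges the reply describes can be sketched like this; the catalog name and the principal are placeholders:

```sql
-- Grant the minimum privileges needed to create schemas in a catalog.
GRANT USE CATALOG   ON CATALOG main TO `data_engineers`;
GRANT CREATE SCHEMA ON CATALOG main TO `data_engineers`;

-- A member of `data_engineers` can then run:
CREATE SCHEMA IF NOT EXISTS main.analytics
  COMMENT 'Tables, views, and functions for the analytics team';
```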

FreshPork
by New Contributor II
  • 3748 Views
  • 2 replies
  • 2 kudos

Resolved! Data Lineage in Unity Catalog on GCP

Hi, is it possible to track and display data lineage with Unity Catalog while using Databricks on GCP, like so: https://docs.databricks.com/en/data-governance/unity-catalog/data-lineage.html? If it's not yet implemented, then is there any roadmap avai...

Data Governance
Data Lineage
GCP
Unity Catalog
Latest Reply
FreshPork
New Contributor II
  • 2 kudos

Thank you, that's great news!

1 More Reply
TienDat
by New Contributor III
  • 6053 Views
  • 3 replies
  • 2 kudos

Resolved! Unity Catalog: Expose REST API for Table Insights

Dear community, our company is using Databricks and we are happy to have Unity Catalog emerge to solve part of our Data Governance problems. We are very interested in the Table Insights feature which is newly introduced (View frequent queries and use...

Latest Reply
TienDat
New Contributor III
  • 2 kudos

Dear @Kaniz, thanks for your answer. I just checked and indeed such information can be queried from the information_schema tables. Anyway, do you have information on whether there is a plan to expose such insights via REST API, or not at all? The reason I a...
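The information_schema querying mentioned in the reply can be sketched as below; catalog and schema names are placeholders, and the exact columns available may vary by Databricks release:

```sql
-- List tables in a schema together with owners and last-altered time,
-- as a starting point for table-usage investigation.
SELECT table_catalog, table_schema, table_name, table_owner, last_altered
FROM main.information_schema.tables
WHERE table_schema = 'analytics';
```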

2 More Replies
fuselessmatt
by Contributor
  • 3391 Views
  • 2 replies
  • 1 kudos

Resolved! Is it possible to manage access for legacy catalogs (hive_metastore) in Terraform?

We have been successfully managing access for our Unity catalogs using the databricks_grant resources in Terraform. Now we want to enable the Rudderstack integration for Databricks, but that does not support Unity Catalog and instead puts files inside...

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Mattias P: Unfortunately, it is not currently possible to manage access to the Hive Metastore catalog (or other external metastores) using the databricks_grant resource in Terraform. This is because the databricks_grant resource is specifically des...

1 More Reply
drii_cavalcanti
by New Contributor III
  • 2074 Views
  • 1 reply
  • 0 kudos

Execute Endpoint API with DLT

Hi, I was wondering if it would be possible to execute a DLT command such as @dlt.create_table against the execute endpoint API and then materialize the asset? Thank you, Adriana Cavalcanti

Latest Reply
drii_cavalcanti
New Contributor III
  • 0 kudos

Ref.: The Endpoint API that I am referring to is: https://docs.databricks.com/api/workspace/commandexecution/execute
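A sketch of calling the linked Command Execution API; the host, token, cluster ID, and context ID below are placeholders (a context must first be created via /api/1.2/contexts/create). Note that @dlt decorators are interpreted by a DLT pipeline run, so executing them through this API alone is unlikely to materialize a DLT asset.

```python
import json
import urllib.request

# Placeholder workspace host and personal access token.
HOST = "https://example-workspace.cloud.databricks.com"
TOKEN = "dapi-xxxx"

# Placeholder cluster and execution-context identifiers.
payload = {
    "clusterId": "1234-567890-abcde123",
    "contextId": "9876543210",
    "language": "python",
    "command": "print('hello from the command execution API')",
}

# Build the POST request for the 1.2 command-execution endpoint.
req = urllib.request.Request(
    f"{HOST}/api/1.2/commands/execute",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req)  # uncomment with real credentials
```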

paniz_asghari
by New Contributor
  • 1955 Views
  • 1 reply
  • 0 kudos

SparkException: There is no Credential Scope.

Hi, I am new to Databricks and trying to connect to RStudio Server from my all-purpose compute cluster. Here is the cluster configuration: Policy: Personal Compute; Access mode: Single user; Databricks Runtime version: 13.2 ML (includes Apache Spark 3.4.0...

Latest Reply
kunalmishra9
New Contributor III
  • 0 kudos

Running into this issue as well. Let me know if you found a resolution, @paniz_asghari!  

carlafernandez
by New Contributor III
  • 26624 Views
  • 10 replies
  • 6 kudos

Resolved! Databricks-connect version 13.0.0 throws Exception with details = "Missing required field 'UserContext' in the request."

I'm trying to connect to a cluster with Runtime 13.0 and Unity Catalog through databricks-connect version 13.0.0 (for Python). The Spark session seems to initialize correctly, but anytime I try to use it, I get the following error: {SparkConnectGrpcExc...

Latest Reply
redperiabras
New Contributor II
  • 6 kudos

I have the same error up to DBR 13.3 LTS. When I upgraded it to 14.0, I was then able to connect with my databricks compute from my local environment.

9 More Replies
BMex
by New Contributor III
  • 2388 Views
  • 1 reply
  • 0 kudos

Resolved! Hide VIEW definition in Unity-Catalog

Hi, I am trying to set up Unity Catalog for my company and ran into a problem today. Basically, for each new source of data we ingest, we create a view layer on top of the "tables". We do that because we have pseudonymized information in our data lake ...

Latest Reply
BMex
New Contributor III
  • 0 kudos

One solution I found is creating a function which does the decryption of the column; from the view creation, I simply call the function and pass the column. This solution, however, pushes me to put the decryption key inside the function in plain-te...
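One way to avoid a plain-text key, assuming the aes_decrypt and secret SQL functions are available in your runtime, is to read the key from a secret scope inside the view; the scope, key, table, and column names below are placeholders:

```sql
-- Keep the decryption key in a Databricks secret scope instead of
-- hard-coding it in the function or view body.
CREATE OR REPLACE VIEW gold.customers_clear AS
SELECT
  customer_id,
  CAST(aes_decrypt(unbase64(email_encrypted),
                   secret('crypto_scope', 'aes_key')) AS STRING) AS email
FROM silver.customers;
```

Access to the secret scope can then be restricted separately from access to the view.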

drii_cavalcanti
by New Contributor III
  • 95 Views
  • 0 replies
  • 0 kudos

Permission on Hive Metastore DBX 10.4

I've been working on creating a schema in the Hive Metastore using the following command: spark.sql(f'CREATE DATABASE IF NOT EXISTS {database}') The schema or database is successfully created, but I encountered an issue where it's only accessible for me...

Data Governance
clusters
hive_metastore
legacy
permission
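For the legacy Hive metastore, access is governed by table access control rather than Unity Catalog grants. A sketch with placeholder database and principal names, assuming a cluster with table ACLs enabled:

```sql
-- Legacy Hive metastore ACLs: let other users see and read the database.
GRANT USAGE  ON DATABASE my_database TO `analysts`;
GRANT SELECT ON DATABASE my_database TO `analysts`;
```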
drii_cavalcanti
by New Contributor III
  • 98 Views
  • 0 replies
  • 0 kudos

Hive Metastore permission on DBX 10.4 Cluster

I've been working on creating a schema in the Hive Metastore using the following command: spark.sql(f'CREATE DATABASE IF NOT EXISTS {database}') The schema or database is successfully created, but I encountered an issue where it's only accessible for m...

drii_cavalcanti
by New Contributor III
  • 75 Views
  • 0 replies
  • 0 kudos

Hive Metastore Schema Permissions

I've been working on creating a schema in the Hive Metastore using the following command: spark.sql(f'CREATE DATABASE IF NOT EXISTS {database}') The schema or database is successfully created, but I encountered an issue where it's only accessible for m...

AsphaltDataRide
by New Contributor III
  • 15604 Views
  • 10 replies
  • 10 kudos

job event trigger - Invalid credentials for storage location

I want to use an event trigger to start a job.
- The MI has the Storage Blob Data Contributor role
- Test connection is successful at the level of the external location
- I have read permission on the external location
- I have owner permission on the job
- O...

Latest Reply
adriennn
Contributor
  • 10 kudos

For reference: https://stackoverflow.com/a/75906376/2842348. Seems this could be made to work by allowing connectivity from Databricks' private vnets, the same way it is currently done for serverless setups if you have an environment that blocks p...

9 More Replies
Shubhanshu
by New Contributor II
  • 2541 Views
  • 1 reply
  • 0 kudos

Unable to create external table

I am trying to create an external table using a CSV file which is stored in ADLS Gen2. My account owner has created a storage credential and an external location. I am a Databricks user with all privileges on the external location. When trying to create a tabl...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Shubhanshu, To overcome the error and create the table, ensure that the client secret token associated with the Azure Active Directory (Azure AD) application service principal is not expired or invalid. Here are the steps you can follow: 1. Open...
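Once the credential is valid, the DDL involved can be sketched as below; the catalog, schema, and abfss path are placeholders, and the user also needs the CREATE EXTERNAL TABLE privilege on the external location:

```sql
-- External CSV table on ADLS Gen2 via a Unity Catalog external location.
CREATE TABLE main.raw.sales_csv
USING CSV
OPTIONS (header 'true', inferSchema 'true')
LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/sales/';
```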
