What's new in workspace Catalog binding
Watch the YouTube video: https://www.youtube.com/watch?v=S9LLpMvAcT4
As a way to minimize storage costs, my team and I want to create views instead of tables in the Gold layer. We always try to improve our users' experience by adding comments to the columns. The problem is that views do not inherit comments from ...
Hi @isabelgontijo, have you found any workaround for this? You can use the CREATE VIEW statement as mentioned below. In addition to this, I am not able to add a comment on a column that is encrypted and that we decrypt in the view. I am using create view...
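Below is a minimal sketch of declaring column comments directly in the view definition (catalog, schema, and table names are hypothetical), assuming a Databricks notebook where `spark` is already available:

```python
# Hypothetical names throughout. Column comments are declared in the view's
# column list, since they are not inherited from the underlying table.
spark.sql("""
    CREATE OR REPLACE VIEW main.gold.customer_v (
        customer_id   COMMENT 'Business key of the customer',
        customer_name COMMENT 'Full legal name of the customer'
    )
    AS SELECT customer_id, customer_name
       FROM main.silver.customer
""")
```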
Hi all, I'm deploying Unity Catalog into a large enterprise with quite strict controls. I want to give my users some autonomy to create tables from upstream datasets they have SELECT access on, while still keeping those controls in place. I've been through a...
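One way to express that kind of scoped autonomy, sketched with hypothetical catalog, schema, and group names: users keep read access on the upstream data and may only create tables inside a dedicated sandbox schema.

```python
# Hypothetical names; run where `spark` is available on a UC-enabled cluster.
# Read-only access to the upstream data:
spark.sql("GRANT USE CATALOG ON CATALOG upstream TO `analysts`")
spark.sql("GRANT USE SCHEMA, SELECT ON SCHEMA upstream.curated TO `analysts`")
# Table creation only inside their own sandbox schema:
spark.sql("GRANT USE CATALOG ON CATALOG sandbox TO `analysts`")
spark.sql("GRANT USE SCHEMA, CREATE TABLE ON SCHEMA sandbox.analysts TO `analysts`")
```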
A schema contains tables, views, and functions. You create schemas inside catalogs. Requirements: you must have the USE CATALOG and CREATE SCHEMA data permissions on the schema's parent catalog. Either a metastore admin or the owner of the catalog can ...
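Putting those requirements together, a small sketch (hypothetical catalog, schema, and group names) of granting the prerequisites and then creating the schema:

```python
# Hypothetical names; the GRANT must be run by a metastore admin, the catalog
# owner, or someone otherwise allowed to grant these privileges.
spark.sql("GRANT USE CATALOG, CREATE SCHEMA ON CATALOG main TO `data-engineers`")
spark.sql("CREATE SCHEMA IF NOT EXISTS main.sales COMMENT 'Curated sales data'")
```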
Hi, is it possible to track and display data lineage with Unity Catalog while using Databricks on GCP, like so: https://docs.databricks.com/en/data-governance/unity-catalog/data-lineage.html? If it's not yet implemented, then is there any roadmap avai...
Thank you, that's great news!
Dear community, our company is using Databricks and we are happy to see Unity Catalog emerge to solve part of our data governance problems. We are very interested in the newly introduced Table Insights feature (view frequent queries and use...
Dear @Kaniz, thanks for your answer. I just checked, and indeed such information can be queried from the information_schema tables. Anyway, do you know whether there is a plan to expose such insights via a REST API, or not at all? The reason I a...
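Until such insights are exposed via an API, one hedged workaround is the query history system table, assuming system tables are enabled on your metastore; the table and column names below may differ in your workspace, and the filter is only a rough text match on the statement.

```python
# Assumption: the system.query.history system table is enabled and readable.
# Rough approximation of "frequent users of a table"; names are hypothetical.
spark.sql("""
    SELECT executed_by, count(*) AS query_count
    FROM system.query.history
    WHERE statement_text ILIKE '%main.gold.customer_v%'
    GROUP BY executed_by
    ORDER BY query_count DESC
    LIMIT 20
""").show()
```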
We have been successfully managing access for our Unity Catalog catalogs using the databricks_grant resources in Terraform. Now we want to enable the Rudderstack integration for Databricks, but that does not support Unity Catalog and instead puts files inside...
@Mattias P: Unfortunately, it is not currently possible to manage access to the Hive metastore catalog (or other external metastores) using the databricks_grant resource in Terraform. This is because the databricks_grant resource is specifically des...
Hi, I was wondering whether it would be possible to execute a DLT command such as @dlt.create_table against the execute endpoint API and then materialize the asset? Thank you, Adriana Cavalcanti
Ref.: The Endpoint API that I am referring to is: https://docs.databricks.com/api/workspace/commandexecution/execute
Hi, I am new to Databricks and trying to connect to RStudio Server from my all-purpose compute cluster. Here is the cluster configuration: Policy: Personal Compute; Access mode: Single user; Databricks Runtime version: 13.2 ML (includes Apache Spark 3.4.0...
Running into this issue as well. Let me know if you found a resolution, @paniz_asghari!
I'm trying to connect to a cluster with Runtime 13.0 and Unity Catalog through databricks-connect version 13.0.0 (for Python). The Spark session seems to initialize correctly, but any time I try to use it, I get the following error: {SparkConnectGrpcExc...
I have the same error up to DBR 13.3 LTS. When I upgraded to 14.0, I was then able to connect to my Databricks compute from my local environment.
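For reference, a minimal connection sketch with the newer Databricks Connect (v13+) API; the host, token, and cluster ID are placeholders, and as noted above the cluster may need DBR 14.0 or later in some setups.

```python
# Placeholders only; requires `pip install databricks-connect`.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.remote(
    host="https://<workspace-url>",
    token="<personal-access-token>",
    cluster_id="<cluster-id>",
).getOrCreate()

# Simple smoke test that forces a round trip to the cluster.
print(spark.range(5).count())
```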
Hi, I am trying to set up Unity Catalog for my company and ran into a problem today. Basically, for each new source of data we ingest, we create a view layer on top of the "tables". We do that because we have pseudonymized information in our data lake ...
One solution I found is creating a function that decrypts the column; in the view definition I simply call the function and pass the column. This solution, however, forces me to put the decryption key inside the function in plain te...
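A hedged sketch of that workaround (all names hypothetical): a SQL function wrapping the built-in aes_decrypt, called from the view. Note this still leaves the key in the function body, which is exactly the concern above, and the key must be 16, 24, or 32 bytes.

```python
# Hypothetical names; the literal key below is only a placeholder and
# illustrates the plain-text-key problem described in the post.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.util.decrypt_str(v BINARY)
    RETURNS STRING
    RETURN cast(aes_decrypt(v, 'abcdefghijklmnop', 'GCM') AS STRING)
""")
spark.sql("""
    CREATE OR REPLACE VIEW main.gold.customers_v AS
    SELECT customer_id,
           main.util.decrypt_str(email_encrypted) AS email
    FROM main.silver.customers
""")
```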
I've been working on creating a schema in the Hive metastore using the following command: spark.sql(f'CREATE DATABASE IF NOT EXISTS {database}'). The schema or database is successfully created, but I encountered an issue where it's only accessible for me...
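If the workspace uses table access control, one possible reason the database is only visible to its creator is missing grants; a hedged sketch with a hypothetical database and group name:

```python
# Assumes table access control (or equivalent grants) is enabled on the
# workspace; the database and group names are hypothetical.
database = "analytics_sandbox"
spark.sql(f"CREATE DATABASE IF NOT EXISTS {database}")
spark.sql(f"GRANT USAGE, SELECT ON DATABASE {database} TO `data-analysts`")
```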
I want to use an event trigger to start a job.
- The MI has the Storage Blob Data Contributor role
- Test connection is successful at the level of the external location
- I have read permission on the external location
- I have owner permission on the job
- O...
For reference: https://stackoverflow.com/a/75906376/2842348. It seems this could be made to work by allowing connectivity from Databricks' private VNets, the same way it is currently done for serverless setups if you have an environment that blocks p...
I am trying to create an external table using a CSV file stored in ADLS Gen2. My account owner has created a storage credential and an external location, and I am a Databricks user with all privileges on the external location. When trying to create a tabl...
Hi @Shubhanshu, To overcome the error and create the table, ensure that the client secret token associated with the Azure Active Directory (Azure AD) application service principal is not expired or invalid. Here are the steps you can follow: 1. Open...
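Once the credential is valid, a minimal sketch (hypothetical table names, storage account, and path) of creating the external table over the CSV file:

```python
# Hypothetical names; the LOCATION must fall under the external location the
# user has privileges on.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.bronze.customers_csv
    USING CSV
    OPTIONS (header 'true', inferSchema 'true')
    LOCATION 'abfss://raw@<storage-account>.dfs.core.windows.net/customers/'
""")
```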