Data Governance

Forum Posts

nccodemonkey
by New Contributor III
  • 24911 Views
  • 18 replies
  • 9 kudos

Unity Catalog Shared Access Mode - dbutils.notebook.entry_point...getContext() not whitelisted

We are switching over to Unity Catalog and attempting to confirm the ability to run our existing notebooks. I have created a new Shared Unity Catalog cluster and ran the notebook using the new cluster. I ran into an error attempting to execute a prin...

Latest Reply
JakubMlacki
New Contributor II

I faced the same issue when switching to a Shared access mode cluster and found that it is possible to run dbutils.notebook.entry_point.getDbutils().notebook().getContext().safeToJson(). Hope this helps.
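In a notebook cell, that call looks roughly like the sketch below (dbutils is available in Databricks notebooks; per this thread, the safeToJson() accessor reportedly works on Shared access mode clusters where the raw getContext() accessors are blocked):

```python
import json

# On a Unity Catalog cluster in Shared access mode, read the notebook
# context through the safeToJson() accessor.
ctx_json = dbutils.notebook.entry_point.getDbutils().notebook().getContext().safeToJson()

# safeToJson() returns a JSON string; parse it to inspect fields such as
# the notebook path, cluster id, and tags.
ctx = json.loads(ctx_json)
print(ctx)
```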

17 More Replies
Karlo_Kotarac
by New Contributor II
  • 748 Views
  • 2 replies
  • 0 kudos

Databricks autocomplete uses hive_metastore catalog although we have other default catalog

Databricks autocomplete is checking the hive_metastore catalog when I enter the schema_name and table_name in the notebook, although we have a different default catalog set at the workspace level. How can I make Databricks autocomplete use that catalog when en...

Latest Reply
Karlo_Kotarac
New Contributor II

Hi @Kaniz! Thanks for your answer. I forgot to mention that we already have this set up at the cluster level (using the spark.databricks.sql.initial.catalog.name variable), besides setting it at the workspace level in the workspace settings, but none of...
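For reference, a rough sketch of where that setting goes and how to check the session default (my_default_catalog is a placeholder; the open question in this thread is whether notebook autocomplete honors it):

```python
# Cluster-level Spark config (Advanced options > Spark config):
#   spark.databricks.sql.initial.catalog.name my_default_catalog

# The default catalog can also be switched per session from a notebook,
# and the current value checked with current_catalog().
spark.sql("USE CATALOG my_default_catalog")
print(spark.sql("SELECT current_catalog()").first()[0])
```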

1 More Replies
Synasenn
by New Contributor II
  • 1388 Views
  • 2 replies
  • 1 kudos

Resolved! Unity catalog; how to remove tags completely?

Hi, we are rolling out Unity in our organization, and while playing around I set some tags on a test catalog. After removing this catalog, the tag keys I used are still present in our Unity environment and I am unable to find where they are stored t...
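For anyone hitting the same thing, a minimal sketch of how tags are set and removed with Unity Catalog SQL (catalog and tag names are placeholders; whether an unused tag key then disappears from the tag suggestions is exactly the question here):

```python
# Attach a tag to a test catalog, then detach it again.
spark.sql("ALTER CATALOG test_catalog SET TAGS ('environment' = 'sandbox')")
spark.sql("ALTER CATALOG test_catalog UNSET TAGS ('environment')")

# Current catalog-level tag assignments can be inspected via the
# information schema tag views.
spark.sql("SELECT * FROM system.information_schema.catalog_tags").show()
```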

Latest Reply
Synasenn
New Contributor II

Thanks! I will take that to heart and include it on our internal wiki

1 More Replies
129876
by New Contributor III
  • 2909 Views
  • 8 replies
  • 2 kudos

bamboolib not working correctly

I'm trying to use bamboolib on the 11.0 recommended DBR, with the option "Databricks: Load database table". The databases and tables that are in Unity Catalog do not populate the dropdown boxes. How can I fix this?

Latest Reply
Vidula
Honored Contributor

Hello @k.b. Hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!

7 More Replies
ReyCMFG
by New Contributor II
  • 20570 Views
  • 10 replies
  • 7 kudos

Can you use Managed Identities in databricks besides Unity Catalog

We are looking to send messages from Databricks to an Azure Service Bus topic and would like to connect to the Service Bus using a managed identity instead of a connection string. Is this possible in Databricks? The only thing I could find regarding datab...
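For context, the Azure SDK side of this looks roughly like the sketch below; it assumes the azure-identity and azure-servicebus packages are installed on the cluster and that the compute can obtain a credential DefaultAzureCredential understands (which is the crux of the managed identity question here). Namespace and topic names are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.servicebus import ServiceBusClient, ServiceBusMessage

# DefaultAzureCredential tries environment variables, managed identity, and
# other credential sources in turn, instead of a connection string.
credential = DefaultAzureCredential()

client = ServiceBusClient(
    fully_qualified_namespace="my-namespace.servicebus.windows.net",
    credential=credential,
)

with client:
    sender = client.get_topic_sender(topic_name="my-topic")
    with sender:
        sender.send_messages(ServiceBusMessage("hello from Databricks"))
```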

Latest Reply
Yetii
New Contributor II

To be honest, the Databricks team sucks. They should add a simple Identity tab under the Databricks workspace resource, as is done for other Microsoft services. They don't think about making the product easier to maintain and configure; they look from ...

9 More Replies
ossinova
by Contributor II
  • 2009 Views
  • 3 replies
  • 0 kudos

Fine grained control of volumes

Is it possible to provide fine-grained control (folder level/file level) for a given volume? I have two SCIM-integrated groups who have read volume access at the catalog level, but those two groups need different permissions at a lower level. Preferab...

Latest Reply
rkalluri-apex
New Contributor III

Can you define the external location at the Landing level, create two Volumes (one for PDF and the other for CSV), and provide access to the respective groups 1 and 2?
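Sketched out in SQL (via spark.sql; catalog, schema, storage paths, and group names are placeholders), that layout could look like:

```python
# Two external volumes under the landing schema, each pointing at its own
# folder of the external location.
spark.sql("""
    CREATE EXTERNAL VOLUME landing_catalog.landing.pdf_volume
    LOCATION 'abfss://landing@mystorageaccount.dfs.core.windows.net/pdf'
""")
spark.sql("""
    CREATE EXTERNAL VOLUME landing_catalog.landing.csv_volume
    LOCATION 'abfss://landing@mystorageaccount.dfs.core.windows.net/csv'
""")

# Grant each SCIM group access to its own volume only, instead of a
# catalog-wide READ VOLUME privilege.
spark.sql("GRANT READ VOLUME ON VOLUME landing_catalog.landing.pdf_volume TO `group_1`")
spark.sql("GRANT READ VOLUME ON VOLUME landing_catalog.landing.csv_volume TO `group_2`")
```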

2 More Replies
Christine
by Contributor
  • 31711 Views
  • 12 replies
  • 15 kudos

Resolved! Cannot use RDD and cannot set "spark.databricks.pyspark.enablePy4JSecurity false" for cluster

I have been using "rdd.flatMap(lambda x: x)" for a while to create lists from columns, however after I changed the cluster to Shared access mode (to use Unity Catalog) I get the following error: py4j.security.Py4JSecurityException: Method public ...

Latest Reply
KandyKad
New Contributor II

Faced this issue multiple times. Solution: 1. Don't use a Shared cluster or a cluster without Unity Catalog enabled for running 'rdd' queries on Databricks. 2. Instead, create a Personal Cluster (Single User) with basic configuration and with Unity Catalog e...
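Where only simple flattening is needed, the rdd.flatMap(lambda x: x) pattern from the question can also be rewritten without the RDD API, which keeps it working on Shared access mode clusters; a rough sketch with a toy DataFrame:

```python
# Toy DataFrame standing in for the real one.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["col_a", "col_b"])

# Original pattern (blocked on Shared access mode clusters):
#   values = df.select("col_a", "col_b").rdd.flatMap(lambda x: x).collect()

# DataFrame-native equivalent: collect the rows and flatten them in Python.
rows = df.select("col_a", "col_b").collect()
values = [v for row in rows for v in row]
print(values)  # [1, 'a', 2, 'b']
```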

11 More Replies
23940829381
by New Contributor II
  • 1589 Views
  • 2 replies
  • 0 kudos

Integrating Unity Catalog ML Models into Data Lineage

I saw a really nice article (https://www.databricks.com/blog/announcing-public-preview-volumes-databricks-unity-catalog) on the incorporation of various elements of data lineage within Unity Catalog. In my own exploration, I've been able to replicate...

Latest Reply
Kaniz
Community Manager

Hi @23940829381, Thank you for sharing your interest in data lineage within Unity Catalog! It’s a powerful feature that allows you to track the flow of data across various elements in your Databricks environment. Let’s delve into this further: Un...

1 More Replies
SergeSmertin
by New Contributor
  • 937 Views
  • 0 replies
  • 0 kudos

Unleashing UCX v0.15.0: A Game-Changer for Upgrading to Unity Catalog

We're thrilled to introduce UCX v0.15.0, packed with cutting-edge features, enhancements, and deprecations that'll take your upgrading experience to new heights!   Release Highlights: AWS S3 support: We've added AWS S3 support to the migrate-location...

Data Governance
Databricks Labs
ucx
TanushM
by New Contributor
  • 2549 Views
  • 3 replies
  • 0 kudos

Unity Catalog- Informatica Cloud Data Governance & Catalog Integration

We are attempting to create a comprehensive Data Governance solution using Unity Catalog and the INFA CDGC tool. The objective is to onboard the business assets on the Informatica DG platform and use Unity Catalog to trace technical assets and lineage ...

Latest Reply
Starry
New Contributor II

Yes, Informatica now has native integration with Unity Catalog through its cloud data catalogue and governance service. You can scan Unity Catalog metadata and propagate it into CDGC, apply policies and business terms, overlay data quality scores, display...

2 More Replies
Ashley1
by Contributor
  • 7005 Views
  • 8 replies
  • 9 kudos

Backup Unity Catalog and managed tables

Hi All, Can anyone point me to either documentation or a personally tried and tested method of backing up (and restoring) Unity Catalog and its associated managed tables? We're running on Azure and using ADLS Gen2. Regards, Ashley

Latest Reply
prasad_vaze
New Contributor III

Our UC managed tables are stored on prod ADLS storage, which is different from the UC root storage account. So what's the best way to back up and restore UC managed tables into a different region? One option is to deep clone the tables, copy ADLS folders to an...
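For the deep clone option mentioned above, the statement looks roughly like this (catalog, schema, and table names are placeholders; the backup catalog would be bound to storage in the target region):

```python
# DEEP CLONE copies metadata and data files, so the clone is an independent
# copy that can live under a different catalog / storage account.
spark.sql("""
    CREATE OR REPLACE TABLE backup_catalog.sales.orders
    DEEP CLONE prod_catalog.sales.orders
""")

# Re-running the statement later syncs the clone to the source's latest state.
```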

7 More Replies
Yevhen
by New Contributor II
  • 2319 Views
  • 1 replies
  • 1 kudos

Metastore ErrorClass=QUOTA_EXCEEDED limit: 100000

I ran into the error "[RequestId=b567c3bb-05fc-4ac8-b4db-2d826b2c00be ErrorClass=QUOTA_EXCEEDED] Cannot create 1 Table(s) in Metastore 95157916-5539-4dc4-a7d6-e9a854ba0591 (estimated count: 100088, limit: 100000)." but I didn't see this limit here ...

Latest Reply
arpit
Contributor III

@Yevhen Here is the doc for quota limit: https://docs.databricks.com/en/data-governance/unity-catalog/index.html#resource-quotas

Yulei
by New Contributor III
  • 2828 Views
  • 3 replies
  • 1 kudos

Resolved! Migrate existing Metastore to a new Metastore in same region

Hi, Databricks Community, I am currently planning to migrate an existing metastore (not in a desired account and name) to a new one (in a different desired account) within the same region. I understand it is not as straightforward as it might seem and complicat...

Latest Reply
Yulei
New Contributor III

Hi @Kaniz, thanks for the recommendations and links; they are helpful and I am going through them one by one.

2 More Replies
iplantevin
by New Contributor III
  • 725 Views
  • 0 replies
  • 0 kudos

Introducing PACE, the new open-source data security engine

Hi Databricks community! Exciting news: we open-sourced PACE (Policy As Code Engine) and launched on Product Hunt, and we'd love your input! PACE offers an abstraction on top of Unity Catalog data access policies, translating YAML/JSON policy definitions int...

Data Governance
Data Policies
Dynamic Views
MiroFuoli
by New Contributor II
  • 3835 Views
  • 3 replies
  • 1 kudos

Unity Catalog - Limited Options for Connection Objects

I’m currently trying to create a Foreign Catalog based on a Connection object of type SQLSERVER. This would allow me to directly access our on-premises MS SQL database from within Azure Databricks using Unity Catalog. As I’m part of a large organizati...

Data Governance
Connection
Foreign Catalog
JDBC
SQL Server
Unity Catalog
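For reference, the general shape of a SQL Server connection and foreign catalog in Unity Catalog SQL (run via spark.sql here; host, credential, and database names are placeholders, with the password read from a secret scope):

```python
# A Unity Catalog connection object holding the JDBC endpoint and credentials.
spark.sql("""
    CREATE CONNECTION IF NOT EXISTS sqlserver_onprem TYPE sqlserver
    OPTIONS (
      host 'sqlserver.mycompany.local',
      port '1433',
      user 'databricks_reader',
      password secret('jdbc-scope', 'sqlserver-password')
    )
""")

# A foreign catalog exposes one database on that server as a queryable catalog.
spark.sql("""
    CREATE FOREIGN CATALOG IF NOT EXISTS sqlserver_sales
    USING CONNECTION sqlserver_onprem
    OPTIONS (database 'SalesDB')
""")
```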
Latest Reply
Kaniz
Community Manager

Hi @MiroFuoli, Firstly, ensuring you’re using the correct versions of dbt-core and dbt-databricks is important. There have been instances where specific versions of these packages have caused issues. You might want to try using an older version, such...

2 More Replies