
Databricks SQL: catalog of each query

noimeta
Contributor II

Currently, we are migrating from the Hive metastore to Unity Catalog (UC).

We have several dashboards and a huge number of queries whose catalog is set to hive_metastore and which use the <db>.<table> access pattern.

I'm wondering if there's a way to switch the catalog of all queries in one go. Switching the catalog of each query one by one would take a ton of work. It would also be best if we could switch back and forth, so we can verify that everything looks correct.

1 ACCEPTED SOLUTION

Accepted Solutions

NateAnth
Valued Contributor

Please see the link here.

This is the setting to configure on non-SQL-Warehouse Databricks clusters:

spark.databricks.sql.initial.catalog.name
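A minimal sketch of the cluster Spark config entry (set under the cluster's Advanced options → Spark config), assuming a target catalog named my_catalog — substitute your own catalog name:

```
spark.databricks.sql.initial.catalog.name my_catalog
```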

To change it for all of the SQL Warehouses in a workspace, you would use:

databricks unity-catalog metastores assign --workspace-id 1234567890123456 \
    --metastore-id 12a345b6-7890-1cd2-3456-e789f0a12b34 \
    --default-catalog-name my_catalog

View solution in original post

7 REPLIES 7

AdrianLobacz
Contributor

Hi, check this link; it's the documentation about UC and the Hive metastore:

https://docs.databricks.com/data-governance/unity-catalog/hive-metastore.html#language-sql

Thank you. I did check the link.

I know that we can explicitly set the catalog name in a query. My point, though, is that I don't want to manually edit each query one by one.
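For reference, this is what the per-query approach from the documentation looks like — setting the catalog explicitly so the old two-part names keep working. A sketch, assuming the target catalog is main and the table is my_db.my_table (both placeholders):

```sql
-- Set the session's default catalog, then query with the old two-part names.
USE CATALOG main;
SELECT * FROM my_db.my_table;
```

Doing this for every query is exactly the manual work I want to avoid.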

youssefmrini
Honored Contributor III

There is a configuration you can use to define the default catalog, and you can have as many SQL endpoints as catalogs. It will make your life much easier.

That sounds very promising.

Could you please pinpoint how to do that? Is there any document I can follow?

NateAnth
Valued Contributor

Please see the link here.

This is the setting to configure on non-SQL-Warehouse Databricks clusters:

spark.databricks.sql.initial.catalog.name

To change it for all of the SQL Warehouses in a workspace, you would use:

databricks unity-catalog metastores assign --workspace-id 1234567890123456 \
    --metastore-id 12a345b6-7890-1cd2-3456-e789f0a12b34 \
    --default-catalog-name my_catalog

Thank you so much. The unity-catalog CLI is working for me.

