08-03-2023 12:40 AM
I've recently started using Unity Catalog and I'm trying to set the default catalog name to something other than hive_metastore for some of my workspaces.
According to the documentation (Update an assignment | Metastores API | REST API reference | Azure Databricks) the default catalog can be changed for a workspace by using the REST API.
I've tried this, but so far without success. After running the "Update an assignment" request, the response indicates that the value was updated successfully.
Yet when I execute a query from a notebook in this workspace without specifying a catalog name, it still seems to resolve to hive_metastore.
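For reference, the call I'm making looks roughly like the sketch below. This assumes the 2.1 Unity Catalog workspace/metastore assignment endpoint from the linked docs; the host, token, workspace ID, metastore ID and catalog name are all placeholders.

```python
import requests

# Placeholders - replace with your workspace URL, token and IDs.
host = "https://adb-1234567890123456.7.azuredatabricks.net"
token = "<personal-access-token>"
workspace_id = "1234567890123456"

# "Update an assignment": PATCH the workspace's metastore assignment
# and set default_catalog_name to the desired catalog.
resp = requests.patch(
    f"{host}/api/2.1/unity-catalog/workspaces/{workspace_id}/metastore",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "metastore_id": "<metastore-id>",
        "default_catalog_name": "my_default_catalog",
    },
)
resp.raise_for_status()
print(resp.status_code)  # 200 means the assignment was accepted
```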
Any ideas on what could be causing this are more than welcome!
08-08-2023 09:00 AM
I would recommend using the Spark conf at the cluster level to change the default catalog. As for the REST API, I believe it only works for SQL warehouses.
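A minimal sketch of that approach, assuming the cluster-level key spark.databricks.sql.initial.catalog.name is the conf that controls the default catalog (the catalog name below is a placeholder):

```python
# On the cluster's configuration page, under Advanced options > Spark > Spark config, add:
#   spark.databricks.sql.initial.catalog.name my_catalog
# (assumption: this key sets the default catalog for notebooks attached to the cluster)

# From a notebook attached to that cluster, read the value back to confirm it was applied:
default_catalog = spark.conf.get("spark.databricks.sql.initial.catalog.name", "not set")
print(f"Cluster default catalog conf: {default_catalog}")
```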
3 weeks ago
I found that setting the default catalog in the workspace "Admin Settings" works for SQL warehouses, Spark clusters and compute policies.
Consult this documentation: https://docs.databricks.com/en/data-governance/unity-catalog/create-catalogs.html#view-the-current-d...
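A quick way to sanity-check the result, assuming a notebook attached to compute started after the setting was changed:

```python
# Returns the catalog that unqualified table names resolve to in this session;
# it should report the new default rather than hive_metastore.
print(spark.sql("SELECT current_catalog()").first()[0])
```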