Data Governance
Databricks autocomplete uses hive_metastore catalog although we have another default catalog

Karlo_Kotarac
New Contributor II

Databricks autocomplete checks the hive_metastore catalog when I enter the schema_name and table_name in a notebook, although we have a different default catalog set at the workspace level. How can I make Databricks autocomplete use that catalog when entering a table name without specifying the catalog_name?

2 REPLIES

Kaniz
Community Manager

Hi @Karlo_Kotarac! When using Databricks, the autocomplete feature often defaults to the hive_metastore catalog when you enter schema and table names in a notebook.

However, if you have another default catalog set at the workspace level and want to use that instead, you can follow these steps:

  1. Per Notebook Configuration:

    • You can set the default catalog for a specific notebook by running the following in a SQL cell:
      %sql
      SET spark.databricks.sql.initial.catalog.name = your_catalog_name

    Replace your_catalog_name with the actual name of your desired catalog. Note that a SQL SET statement takes the value literally, so quotes are not needed.

  2. Workspace-Level Configuration:

    • To set the default catalog for every notebook attached to a cluster, add the following to the cluster’s Spark configuration (cluster Spark config entries are written as key and value separated by a space):
      spark.databricks.sql.initial.catalog.name your_catalog_name

    Again, replace your_catalog_name with the appropriate catalog name.

  3. Using SQL Commands:

    • You can explicitly set the current catalog within a notebook using the USE CATALOG command. For example:
      USE CATALOG your_catalog_name;
      

    This will ensure that unqualified table names are resolved from the specified catalog.

Remember to replace your_catalog_name with the actual name of the catalog you want to use. These steps should help you override the default behavior and work with the desired catalog when entering unqualified table names.
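As a quick sanity check after applying any of the options above, you can ask Spark which catalog unqualified names currently resolve against. This sketch uses the built-in current_catalog() SQL function; your_catalog_name is a placeholder for your own catalog:

```sql
-- Check which catalog unqualified table names currently resolve to
SELECT current_catalog();

-- Switch the session default, then verify it took effect
USE CATALOG your_catalog_name;  -- placeholder: replace with your catalog
SELECT current_catalog();       -- should now report your_catalog_name
```

If the second SELECT still reports hive_metastore, the session-level setting did not take effect and the cluster- or workspace-level configuration is worth re-checking.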

Happy querying! 🚀

 

Karlo_Kotarac
New Contributor II

Hi @Kaniz! Thanks for your answer. I forgot to mention that we already have this set at the cluster level (using the spark.databricks.sql.initial.catalog.name property), in addition to setting it at the workspace level in the workspace settings, but neither helped: autocomplete always looks at the hive_metastore catalog if we don't specify the catalog name. However, if I run the USE CATALOG command first in the notebook, autocomplete works fine; I was hoping it would work without that.
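The workaround described above can be sketched as the first cell of the notebook; your_catalog_name is a placeholder for the workspace's actual default catalog:

```sql
-- First notebook cell: set the session catalog explicitly so that
-- autocomplete and unqualified schema_name.table_name references
-- resolve against it rather than hive_metastore.
USE CATALOG your_catalog_name;  -- placeholder: replace with your catalog
```

This has to be re-run in each notebook session, which is exactly the inconvenience the question is about; the cluster- and workspace-level settings are meant to make it unnecessary, but in this case they did not affect autocomplete.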