Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Databricks autocomplete uses hive_metastore catalog although we have other default catalog

Karlo_Kotarac
New Contributor III

Databricks autocomplete is checking the hive_metastore catalog when I enter the schema_name and table_name in the notebook although we have other default catalog set at the workspace level. How to make Databricks autocomplete use that catalog when entering table name without specifying catalog_name?

2 REPLIES

Kaniz_Fatma
Community Manager

Hi @Karlo_Kotarac, when using Databricks, the autocomplete feature often defaults to the hive_metastore catalog when you enter schema and table names in a notebook.

However, if you have another default catalog set at the workspace level and want to use that instead, you can follow these steps:

  1. Per Notebook Configuration:

    • You can set the default catalog for a specific notebook by adding the following configuration to the notebook's code:
      %sql
      SET spark.databricks.sql.initial.catalog.name = "your_catalog_name"
      

    Replace "your_catalog_name" with the actual name of your desired catalog.

  2. Workspace-Level Configuration:

    • To set the default catalog for the entire workspace, you can configure it at the cluster level. Add the following to your cluster's Spark configuration:
      spark.databricks.sql.initial.catalog.name = "your_catalog_name"
      

    Again, replace "your_catalog_name" with the appropriate catalog name.

  3. Using SQL Commands:

    • You can explicitly set the current catalog within a notebook using the USE CATALOG command. For example:
      USE CATALOG your_catalog_name;
      

    This will ensure that unqualified table names are resolved from the specified catalog.

Remember to replace "your_catalog_name" with the actual name of the catalog you want to use. These steps should help you override the default behavior and work with the desired catalog when entering table names.
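The three options above can be combined in a single notebook SQL cell to check and switch the catalog; this is a sketch, and my_catalog, my_schema, and my_table are placeholder names:

```sql
-- Show which catalog unqualified names currently resolve against
SELECT current_catalog();

-- Options 1/2: set the initial-catalog property for the session
SET spark.databricks.sql.initial.catalog.name = my_catalog;

-- Option 3: switch the session's current catalog explicitly
USE CATALOG my_catalog;

-- Unqualified names now resolve as my_catalog.<schema>.<table>
SELECT * FROM my_schema.my_table LIMIT 10;
```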

Happy querying! 🚀

 

Karlo_Kotarac
New Contributor III

Hi @Kaniz_Fatma! Thanks for your answer. I forgot to mention that we already have this set at the cluster level (using the spark.databricks.sql.initial.catalog.name property), in addition to setting it in the workspace settings, but neither helped: autocomplete always looks at the hive_metastore catalog if we don't specify the catalog name. However, if I run the USE CATALOG command first in the notebook, autocomplete works fine. I was hoping it would work without that.
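For reference, the cluster-level setup described here is entered in the cluster's Spark config as a space-separated key/value pair (a sketch; my_catalog is a placeholder):

```
spark.databricks.sql.initial.catalog.name my_catalog
```

As noted above, this sets the default catalog for query resolution, but notebook autocomplete may still suggest from hive_metastore until a USE CATALOG statement runs in the session.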
