11-20-2024 04:00 AM
In workspace settings (Workspace admin > Advanced > Other), "Default catalog for the workspace" is set to something other than hive_metastore; it is set to a `Unity Catalog` catalog. The expected behaviour is copied here from the related "more info" panel:
"Setting the default catalog for the workspace determines the catalog that is used when queries do not reference a fully qualified 3 level name. For example, if the default catalog is set to 'retail_prod' then a query 'SELECT * FROM myTable' would reference the object 'retail_prod.default.myTable' (the schema 'default' is always assumed)."
In the case of hive_metastore, it works as expected when a new SQL Editor tab is opened. In the case of Unity Catalog, the new tab uses the Unity Catalog as the default catalog (as expected), but the default schema is not 'default'; instead, the first schema in the catalog in alphabetical order becomes the default, which in my case is called 'archive'.
Is there something I missed here, or is there a setting I can use to enforce 'default' as the default schema in the SQL Editor?
Accepted Solutions
11-20-2024 06:35 AM
Hi @gyorgyjelinek,
It appears that the behavior you are experiencing is due to the default schema selection mechanism in Unity Catalog. When a new SQL Editor tab is opened, the default catalog is correctly set to the Unity Catalog you specified. However, instead of defaulting to the 'default' schema, it defaults to the first schema in alphabetical order, which in your case is 'archive'.
This behavior is consistent with the current implementation of Unity Catalog in Databricks. Unfortunately, there is no direct setting available to enforce the 'default' schema as the default schema in the SQL Editor when using Unity Catalog. The system defaults to the first schema in alphabetical order if no specific schema is mentioned.
To work around this, you can explicitly specify the schema in your queries or set the schema at the beginning of your SQL sessions using the `USE SCHEMA` command. For example:
```sql
USE SCHEMA default;
```
This command can be run at the start of your SQL session to ensure that the 'default' schema is used for subsequent queries.
Or simply use the fully qualified three-level name, for example:
```sql
SELECT * FROM catalog.schema.table
```
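Putting the two workarounds together, a session preamble for the SQL Editor might look like the following sketch. Note that `my_catalog` and `sales` are placeholder names for illustration, not names from this thread:

```sql
-- Pin the session to a specific catalog and schema so that
-- unqualified table names resolve predictably.
USE CATALOG my_catalog;  -- placeholder: your Unity Catalog catalog
USE SCHEMA default;

-- Unqualified names now resolve against my_catalog.default,
-- so this reads my_catalog.default.sales (placeholder table).
SELECT * FROM sales;
```

Running the two `USE` statements once at the top of a session avoids having to fully qualify every table reference in subsequent queries.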
11-20-2024 08:42 AM
Hi @Alberto_Umana ,
Thank you for the explanation. I have marked your comment as the accepted solution, as it describes the current implementation logic and the workaround. Good to know that the "more info" panel is a bit misleading for now, because the SQL Editor is an exception.

