Hi @Retired_mod! Thanks for your answer. I forgot to mention that we already have this set up at the cluster level (via the `spark.databricks.sql.initial.catalog.name` Spark config) in addition to setting it at the workspace level in the workspace settings, but neither of these helped: autocomplete still resolves against the `hive_metastore` catalog whenever we don't qualify the catalog name. However, if I run a `USE CATALOG` command first in the notebook, autocomplete works fine, but I was hoping it would work without that.
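
For reference, this is the workaround we currently run at the top of each notebook (the catalog name `main` here is just a placeholder for our actual default catalog):

```python
# Cluster-level config we already have (set in the cluster's Spark config),
# which did not fix autocomplete on its own:
#   spark.databricks.sql.initial.catalog.name main

# Notebook-level workaround: after this runs, autocomplete resolves
# unqualified table names against this catalog instead of hive_metastore.
spark.sql("USE CATALOG main")
```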