Hi @ArturSvoboda
This sounds like a classic Databricks scoping issue: views that exist only in your Spark session versus views registered in the metastore. Here are the most likely causes and things to check:
1. Temporary Views vs Persistent Views
Your renamed views might be temporary views that only exist in your current Spark session:
- Temporary views are created with CREATE OR REPLACE TEMPORARY VIEW and exist only in your current Spark session.
- Global temporary views are created with CREATE GLOBAL TEMPORARY VIEW and exist across sessions, but only under the special global_temp database.
- Persistent views are created with CREATE VIEW and are stored in the metastore (see the sketch below).
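For reference, here's roughly how each kind is created (the view and table names below are placeholders):
-- Session-scoped: gone when the session ends, never registered in the metastore
CREATE OR REPLACE TEMPORARY VIEW my_temp_view AS
SELECT * FROM some_source_table;
-- Cross-session on the same cluster, but addressable only as global_temp.<name>
CREATE OR REPLACE GLOBAL TEMPORARY VIEW my_global_view AS
SELECT * FROM some_source_table;
-- Registered in the metastore; survives restarts and shows up in SHOW VIEWS
CREATE OR REPLACE VIEW my_schema.my_persistent_view AS
SELECT * FROM some_source_table;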
2. Location/Namespace Issues
Check where these views actually exist:
-- Try these commands to locate your views:
SHOW VIEWS IN global_temp;
SHOW TABLES IN global_temp;
DESCRIBE EXTENDED your_view_name;
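In the DESCRIBE EXTENDED output, the rows to check are typically Type (it will say VIEW for a persistent view), plus Catalog and Database/Schema, which tell you exactly where the object is registered. If the bare name doesn't resolve, try the fully qualified form (catalog and schema names here are placeholders):
DESCRIBE EXTENDED my_catalog.my_schema.your_view_name;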
3. Three-Part Naming
In Unity Catalog, you need the full three-part name:
-- Instead of just:
SELECT * FROM your_view;
-- Use:
SELECT * FROM catalog.schema.your_view;
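Alternatively, you can set the session defaults once so that shorter names resolve where you expect (the catalog and schema names here are examples):
-- Point the session at the right catalog and schema
USE CATALOG my_catalog;
USE SCHEMA my_schema;
-- Bare names now resolve to my_catalog.my_schema.<name>
SELECT * FROM your_view;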
4. Session vs Metastore Storage
- If SELECT * works but SHOW VIEWS doesn't list them, they're likely temporary views scoped to your session.
- The _sqldf you're seeing is the implicit DataFrame that Databricks notebooks create from the most recent SQL cell result; it lives only in your notebook session, not in the metastore.
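If that's what happened, one way forward is to persist the renamed view as a metastore-backed view so it survives the session and can be joined from anywhere (catalog and schema names are placeholders):
-- Promote a session-scoped temp view into a persistent view
CREATE OR REPLACE VIEW my_catalog.my_schema.renamed_view AS
SELECT * FROM renamed_temp_view;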
Quick diagnostic:
-- Check current database
SELECT current_database();
-- Look for temp views
SHOW VIEWS IN global_temp;
-- Check if they're in a different schema
SHOW SCHEMAS;
Metastore type confusion:
You mentioned hive_metastore - are you mixing the legacy Hive metastore and Unity Catalog? The views might exist in one but not the other.
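A quick way to rule that out is to look in both places explicitly (the Unity Catalog names below are examples, substitute your own):
-- Which catalog is the session currently using?
SELECT current_catalog();
-- Look for the views in the legacy Hive metastore...
SHOW VIEWS IN hive_metastore.default;
-- ...and in the Unity Catalog schema you expect them in
SHOW VIEWS IN my_catalog.my_schema;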
Can you try:
- Running DESCRIBE EXTENDED view_name on one of your views
- Sharing the exact error message you get when trying to JOIN
- Confirming which metastore/catalog you're working in
LR