Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

DROP VIEW IF EXISTS Failing on Dynamically Generated Temporary View in Databricks 15.4 LTS

al_rammos
New Contributor II

Hello everyone,

I'm experiencing a very strange issue with temporary views in Databricks 15.4 LTS that did not occur in 13.3. I have a workflow where I create a temporary view, run a query against it, and then drop it using a DROP VIEW IF EXISTS command. The issue is as follows:

Issue Description

  • I create a temporary view using a dynamically generated name:

temp_view_name = f"{self.input_data}_{uuid4().hex}"
df.createOrReplaceTempView(temp_view_name)
  • I then run a query against the view (which works fine), and afterwards I attempt to drop the view using an f-string:

self.spark.sql(f"DROP VIEW IF EXISTS {temp_view_name}")
  • However, this DROP statement fails with an error:

[TABLE_OR_VIEW_NOT_FOUND] The table or view `input_df_<uuid>` cannot be found. Verify the spelling and correctness of the schema and catalog. ...
  • Curiously, if I hardcode the entire DROP statement as a literal string:

self.spark.sql('''
DROP VIEW IF EXISTS input_df_5bf8576e20da4c49a986ed81428ea839
''')

the DROP works as expected.
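
For reference, here is a minimal, self-contained sketch of the pattern (the DataFrame and the query are simplified placeholders, not my actual workflow):

from uuid import uuid4

# Placeholder DataFrame standing in for my real input data.
df = spark.range(10).withColumnRenamed("id", "value")

# Dynamically generated view name, as in the real workflow.
temp_view_name = f"input_df_{uuid4().hex}"
df.createOrReplaceTempView(temp_view_name)

# Querying the view works fine.
spark.sql(f"SELECT COUNT(*) AS n FROM {temp_view_name}").show()

# On 15.4 LTS this DROP fails with TABLE_OR_VIEW_NOT_FOUND,
# even though hardcoding the same statement as a literal succeeds.
spark.sql(f"DROP VIEW IF EXISTS {temp_view_name}")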

What I've Tried

  • Quoting the Identifier:
    I tried wrapping the temporary view name in backticks:

self.spark.sql(f"DROP VIEW IF EXISTS `{temp_view_name}`")

but the error persists.

  • Using the Spark Catalog API:
    I also attempted:

self.spark.catalog.dropTempView(temp_view_name)

which yields the same TABLE_OR_VIEW_NOT_FOUND error.

  • Verifying the View's Existence:
    Right after creating the view, I list all session tables with:

print([t.name for t in self.spark.catalog.listTables()])

and the temporary view is present in the list.
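
For what it's worth, the same existence check can also be expressed with the Catalog API's tableExists (my assumption is that it resolves session temp views the same way listTables() does):

# Assumed equivalent existence check via the Catalog API (Spark 3.3+);
# prints True if the current session's catalog can see the temp view.
print(self.spark.catalog.tableExists(temp_view_name))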

 

Environment Details

  • Working Runtime: Databricks 13.3 LTS (no issues with DROP VIEW IF EXISTS using dynamic names).

  • Problematic Runtime: Databricks 15.4 LTS.

  • View Creation: The temporary view is created successfully and is visible when listing tables.

  • DROP Behavior: When dropping using a literal SQL string, it works; when dropping using a dynamic f-string (or the catalog API), it fails with TABLE_OR_VIEW_NOT_FOUND.

Questions

  1. Has anyone encountered this discrepancy in 15.4 LTS regarding dynamic temporary view drop behavior?

  2. Could this be a bug or a change in how the session catalog resolves temporary views in 15.4 LTS?

  3. Are there any recommended workarounds or settings to ensure that DROP VIEW IF EXISTS works with dynamically generated temporary view names?

2 REPLIES

Alberto_Umana
Databricks Employee

Hi @al_rammos,

Thanks for your detailed comments and for replicating the issue. There have been known issues in recent DBR versions where dynamically created temporary views are not properly resolved during certain operations due to incorrect session mapping. For example, display commands or interactive queries sometimes reference a placeholder instead of the user's actual session. This misalignment means temporary views may not be accessible from the session where they were created. I will check internally whether there is a case open for this.

Could you please try the following and see if it works:

if temp_view_name in [v.name for v in self.spark.catalog.listTables()]:
    self.spark.catalog.dropTempView(temp_view_name)
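
For clarity, here is the same guard as a small self-contained helper (just an illustration; it assumes the usual spark session object available in your job):

def drop_temp_view_if_visible(spark, temp_view_name):
    """Drop a temp view only if this session's catalog can actually see it."""
    visible = [t.name for t in spark.catalog.listTables()]
    if temp_view_name in visible:
        spark.catalog.dropTempView(temp_view_name)
        return True
    return False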

Hello Alberto,

Thanks for the prompt response! I tried your suggestion of checking right before dropping the view; more specifically, I did this:

def create_df_view():
    df = available_dfs['input_df']
    # uuid = uuid4().hex
    temp_view_name = 'al_rammos_view'
    df.createOrReplaceTempView(temp_view_name)

    df_output = spark.sql(  # type: ignore
        GROUPING_SET_QUERY.format(
            temp_view_name=temp_view_name,
        )
    )

    # List the session's tables to verify the temp view is registered
    current_views = [t.name for t in spark.catalog.listTables()]
    print("Current views:", current_views)
    available_dfs['output_data'] = df_output
    if temp_view_name in current_views:
        print('im here')
        spark.catalog.dropTempView(temp_view_name)
        print("View dropped")
    else:
        print(f"View {temp_view_name} not found; skipping DROP.")
    return df_output

but when I displayed the result from the function (which triggers the action), I got the exact same error:

df_output_trial = create_df_view()
display(df_output_trial)
Current views: ['al_rammos_view']
im here
View dropped
[TABLE_OR_VIEW_NOT_FOUND] The table or view `al_rammos_view` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01

Unfortunately, this hasn't resolved the failures. In my tests, the check right before the drop does find the view, yet the query still fails with the same TABLE_OR_VIEW_NOT_FOUND error once the result is actually evaluated. The behavior is effectively unchanged from before.

Are there any other known workarounds or fixes for the incorrect session mapping with local temp views in the newer Databricks runtimes? And is there an estimate for when this bug will be fixed? Please let me know if there's anything else I can try.
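
For what it's worth, one restructuring I'm considering (just a sketch, not yet verified on 15.4) is to keep the view alive until every action on the result has finished, and only drop it afterwards:

def create_df_view():
    df = available_dfs['input_df']
    temp_view_name = 'al_rammos_view'
    df.createOrReplaceTempView(temp_view_name)
    df_output = spark.sql(GROUPING_SET_QUERY.format(temp_view_name=temp_view_name))
    available_dfs['output_data'] = df_output
    # No DROP here: the caller drops the view once all actions are done.
    return df_output, temp_view_name

df_output_trial, view_name = create_df_view()
display(df_output_trial)                   # action runs while the view still exists
spark.catalog.dropTempView(view_name)      # drop only after the action has completed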

Kind Regards
