Hello Alberto,
Thanks for the prompt response! I tried your suggestion of checking right before dropping the view; more specifically, I did this:
def create_df_view():
    df = available_dfs['input_df']
    # uuid = uuid4().hex
    temp_view_name = 'al_rammos_view'
    df.createOrReplaceTempView(temp_view_name)
    df_output = spark.sql(  # type: ignore
        GROUPING_SET_QUERY.format(
            temp_view_name=temp_view_name,
        )
    )
    # List the temp views visible in this session to verify the view exists
    current_views = [t.name for t in spark.catalog.listTables()]
    print("Current views:", current_views)
    available_dfs['output_data'] = df_output
    if temp_view_name in current_views:
        print('im here')
        spark.catalog.dropTempView(temp_view_name)
        print("View dropped")
    else:
        print(f"View {temp_view_name} not found; skipping DROP.")
    return df_output
but when I displayed the result from the function (which is what triggers the action), I got the exact same error:
df_output_trial = create_df_view()
display(df_output_trial)
Current views: ['al_rammos_view']
im here
View dropped
[TABLE_OR_VIEW_NOT_FOUND] The table or view `al_rammos_view` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01
Unfortunately, this hasn't resolved the crashes. As the output above shows, the check does find the view and the drop succeeds, yet the query still fails with the same TABLE_OR_VIEW_NOT_FOUND error once the action runs. The behavior is effectively unchanged from before.
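If it helps, here's a minimal version of the same pattern that I can run to isolate it (repro_view and the trivial SELECT are just placeholders, not my real query). If the plan is only resolved when the action runs, or if the view lands in a different session, I'd expect this to hit the same error:

df = spark.range(5)
df.createOrReplaceTempView('repro_view')
out = spark.sql('SELECT id FROM repro_view')
spark.catalog.dropTempView('repro_view')  # drop before the action, as in my function
out.show()  # the action runs here; this is where I'd expect TABLE_OR_VIEW_NOT_FOUND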
Are there any other known workarounds or fixes for the incorrect session mapping with local temp views in the newer Databricks runtimes? Or is there an ETA for when this bug will be fixed? Please let me know if there's anything else I can try.
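In the meantime, the variant I'm planning to test next forces the query to run while the view still exists and only drops the view afterwards. This is just a sketch based on the function above: collect() pulls the full result back to the driver, so it may be too expensive for large outputs, and the uuid4 suffix is the idea from my commented-out line:

from uuid import uuid4

def create_df_view_eager():
    df = available_dfs['input_df']
    # Unique suffix per call so concurrent runs can't collide on the view name
    temp_view_name = f'al_rammos_view_{uuid4().hex}'
    df.createOrReplaceTempView(temp_view_name)
    try:
        df_output = spark.sql(
            GROUPING_SET_QUERY.format(temp_view_name=temp_view_name)
        )
        # collect() is eager: the query is resolved and executed now,
        # while the view still exists, rather than later at display() time
        rows = df_output.collect()
        result = spark.createDataFrame(rows, schema=df_output.schema)
        available_dfs['output_data'] = result
        return result
    finally:
        # By this point the query has been fully consumed, so the drop is safe
        spark.catalog.dropTempView(temp_view_name)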
Kind regards,