-werners-
Esteemed Contributor III

You can always write a DataFrame to persistent storage.

Just use df.write.parquet() (or whatever format you choose).
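For example (a minimal PySpark sketch; the sample DataFrame and the storage path are placeholders, and `spark` is the session Databricks already provides in a notebook):

```python
# Hypothetical example data - replace with your own DataFrame.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# Write it out as Parquet; .mode("overwrite") replaces any existing files at the path.
df.write.mode("overwrite").parquet("/mnt/my-storage/my_table")
```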

You can also create a table or view on top of the Parquet files if necessary and run SQL queries against it (from Databricks notebooks or over an ODBC connection).
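Something like this (again just a sketch with placeholder names; a temp view only lives for the session, so for ODBC access you would register a permanent table instead):

```python
# Read the Parquet files back and expose them as a view for SQL queries.
spark.read.parquet("/mnt/my-storage/my_table").createOrReplaceTempView("my_table_view")

# Query it with SQL from a notebook.
spark.sql("SELECT value, COUNT(*) AS cnt FROM my_table_view GROUP BY value").show()

# Or register a permanent external table over the same files,
# which is also reachable from ODBC/BI clients.
spark.sql("""
  CREATE TABLE IF NOT EXISTS my_table
  USING PARQUET
  LOCATION '/mnt/my-storage/my_table'
""")
```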