Community Discussions
How to save a catalog table as a spark or pandas dataframe?

New Contributor


I have a table in my catalog, and I want to load it as a pandas or Spark DataFrame.

I was using the code below to do that, but I don't know what happened recently; the code no longer works.


from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("test").getOrCreate()
source_table = "my_path.test"
df_spark = spark.table(source_table)


Now, when I run it, I get this error:

Unsupported cell during execution. SQL warehouses only support executing SQL cells.

If I query the table using SQL, how can I store the result as a DataFrame?

I used this line of code to do that, but I get the same error again.


df_spark = spark.sql("SELECT * FROM my_path.test")



Community Manager

Hi @Pbr, to work around this, you can create a temporary view using SQL in a separate cell (e.g., a %sql cell) and then reference that view from your Python or Scala code.

Here’s how you can achieve this:

  • First, create a temporary view for your table using SQL:

    CREATE OR REPLACE TEMPORARY VIEW my_temp_view AS
    SELECT * FROM my_path.test
  • Next, in your Python or Scala code, reference the temporary view to create a DataFrame:

    • In Scala:
      val df = spark.sql("SELECT * FROM my_temp_view")
    • In PySpark:
      df = spark.sql("SELECT * FROM my_temp_view")
Once the temporary view exists, you can use the df DataFrame for further processing or save it to blob storage.

Remember to replace my_path.test with the actual catalog path to your table.

If you encounter any issues, feel free to ask for further assistance! 😊
