
How to save a catalog table as a Spark or pandas DataFrame?

Pbr
New Contributor

Hello

I have a table in my catalog, and I want to load it as a pandas or Spark DataFrame.

I was using the code below to do this, but something must have changed recently because it no longer works.

 

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("test").getOrCreate()
source_table = "my_path.test"
df_spark = spark.read.table(source_table)
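Once the read works, I assume getting the pandas version is just a conversion of the Spark DataFrame; a minimal sketch of that step (assuming the same my_path.test table and that the read above succeeds):

# Convert the Spark DataFrame to pandas. This collects all rows to the driver,
# so it is only practical for tables that fit in driver memory.
df_pandas = df_spark.toPandas()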

 

Now, when I run it, I get this error:

Unsupported cell during execution. SQL warehouses only support executing SQL cells.

If I query the table with SQL instead, how can I store the result as a DataFrame?

I tried the line below to do that, but I get the same error.

 

df_spark = spark.sql("SELECT * FROM my_path.test")

 

0 REPLIES
