05-29-2024 01:06 PM
I noticed that on some Databricks 14.3 clusters, I get DataFrames with type pyspark.sql.connect.dataframe.DataFrame, while on other clusters, also with Databricks 14.3, the exact same code gets DataFrames of type pyspark.sql.DataFrame.
pyspark.sql.connect.dataframe.DataFrame seems to be causing various issues, for example:
- Code that checks isinstance(df, DataFrame) does not recognize df as a DataFrame, even though pyspark.sql.connect.dataframe.DataFrame inherits from pyspark.sql.DataFrame
- I get this error with pyspark.sql.connect.dataframe.DataFrame and a third-party library (Great Expectations), but not with pyspark.sql.DataFrame: [CANNOT_RESOLVE_DATAFRAME_COLUMN] Cannot resolve dataframe column "<column name>". It's probably because of illegal references like `df1.select(df2.col("a"))`. SQLSTATE: 42704
To help investigate, I would like to know:
- What is the difference between pyspark.sql.connect.dataframe.DataFrame and pyspark.sql.DataFrame?
- What determines whether I will get one type of DataFrame or the other?
- Does pyspark.sql.connect.dataframe.DataFrame have limitations that would make the issues I'm seeing expected?
05-31-2024 07:13 AM
Hi @Retired_mod,
That's incorrect. I use exactly the same code and get either a pyspark.sql.dataframe.DataFrame or a pyspark.sql.connect.dataframe.DataFrame depending on the cluster. It doesn't matter whether I create the DataFrame using spark.read.table, spark.sql, or even spark.createDataFrame for in-memory data; what determines which class I get is the cluster configuration.
This screenshot illustrates what I mean. I ran the same notebook on two different clusters and got a different DataFrame type depending on the cluster. The only difference I can see between the two clusters is that one is a single-user cluster and the other is a shared (multi-user) cluster. Both clusters use Databricks 14.3.
So the choice of class is an internal implementation decision by Databricks. The question is what leads Databricks to pick one class or the other and, given that they don't appear to be 100% interchangeable, what the limitations are.
Also note that both classes have methods like select, filter, groupBy, cache, persist that can be used the same way with both classes. Both can also be used to run SQL queries or directly read a table without using a query.
08-28-2024 12:54 PM
What makes the difference is whether the cluster uses Spark Connect.
Shared clusters use Spark Connect, so even the spark session is of a different type (pyspark.sql.connect.session.SparkSession); on a single-user cluster, by comparison, it is the classic pyspark.sql.SparkSession.
What I tested is that you can disable Spark Connect on the cluster by setting spark.databricks.service.server.enabled to false, but in that case everything stops working.
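If you can't tell from the cluster configuration which mode you're in, you can check programmatically. A minimal sketch, assuming PySpark 3.4+ where pyspark.sql.utils.is_remote exists; the environment-variable fallback is an assumption for environments where pyspark itself isn't importable:

```python
def running_on_spark_connect() -> bool:
    """Best-effort check for whether this process uses Spark Connect."""
    try:
        # Available in PySpark >= 3.4; True for Spark Connect sessions.
        from pyspark.sql.utils import is_remote
        return is_remote()
    except ImportError:
        # Fallback: Spark Connect sessions are typically configured
        # through the SPARK_REMOTE environment variable.
        import os
        return "SPARK_REMOTE" in os.environ
```

In a notebook you can also simply inspect `type(spark)`: on a Spark Connect cluster its module starts with `pyspark.sql.connect`.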
08-06-2024 11:35 PM
@Retired_mod @ckarrasexo Any updates on this? I'm facing the same issue
08-27-2024 03:37 PM
@Retired_mod I am also running into this issue, also with Great Expectations as it happens. I have also tried using the read parquet approach as you suggested and am still getting the problematic type. Is it possible to direct Databricks to create one type, or to convert or cast between them?
08-27-2024 03:59 PM
Additional info: in Databricks 13.3, the spark variable we're provided is of type pyspark.sql.SparkSession. In 15.4 it is created as pyspark.sql.connect.session.SparkSession (both on shared clusters; it may behave differently in a single-node configuration).
09-17-2024 01:24 AM - edited 09-17-2024 01:37 AM
Hitting the same problem when checking the types of variables to pick out DataFrames.
Ended up getting around this (temporarily at least) by changing which module the DataFrame class is imported from.
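One pattern along those lines that works on both cluster types is to check against both DataFrame classes at once. This is a sketch of that idea, not necessarily the exact import the poster used; the connect module path assumes PySpark 3.4+:

```python
def spark_dataframe_classes() -> tuple:
    """Collect whichever DataFrame classes are importable here."""
    classes = []
    try:
        from pyspark.sql import DataFrame as ClassicDataFrame  # classic API
        classes.append(ClassicDataFrame)
    except ImportError:
        pass
    try:
        # Spark Connect variant; module exists in PySpark >= 3.4
        from pyspark.sql.connect.dataframe import DataFrame as ConnectDataFrame
        classes.append(ConnectDataFrame)
    except ImportError:
        pass
    return tuple(classes)


def is_spark_dataframe(obj) -> bool:
    """True if obj is a DataFrame from either the classic or Connect API."""
    # isinstance() accepts a tuple of classes (and returns False for an
    # empty tuple, so this degrades gracefully without pyspark installed).
    return isinstance(obj, spark_dataframe_classes())
```

This avoids caring about which class the cluster hands back, at the cost of an extra import attempt per check (which could be cached at module level).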
10-08-2024 03:10 AM
@ckarrasexo wrote: I noticed that on some Databricks 14.3 clusters, I get DataFrames with type pyspark.sql.connect.dataframe.DataFrame, while on other clusters also with Databricks 14.3, the exact same code gets DataFrames of type pyspark.sql.DataFrame
pyspark.sql.connect.dataframe.DataFrame seems to be causing various issues.
for example:
- Code that checks for isinstance(df, DataFrame) does not recognize df to be a DataFrame, even though pyspark.sql.connect.dataframe.DataFrame inherits from pyspark.sql.DataFrame
- I get this error with pyspark.sql.connect.dataframe.DataFrame and a third-party library (Great Expectations), but not with pyspark.sql.DataFrame: [CANNOT_RESOLVE_DATAFRAME_COLUMN] Cannot resolve dataframe column "<column name>". It's probably because of illegal references like `df1.select(df2.col("a"))`. SQLSTATE: 42704
To help investigate, I would like to know:
- What is the difference between pyspark.sql.connect.dataframe.DataFrame and pyspark.sql.DataFrame?
- What determines if I will get one type of DataFrame or the other?
- Does pyspark.sql.connect.dataframe.DataFrame have limitations that would make the issues I'm seeing expected?
I'm seeing inconsistencies in Databricks 14.3 where some clusters return DataFrames as pyspark.sql.connect.dataframe.DataFrame while others return pyspark.sql.DataFrame. This affects type checking with isinstance(df, DataFrame), and I'm hitting errors with Great Expectations, specifically "CANNOT_RESOLVE_DATAFRAME_COLUMN". Has anyone else dealt with this issue, and what solutions did you find?