Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to force pandas_on_spark plots to use all dataframe data?

DavideCagnoni
Contributor

When I load a table as a `pandas_on_spark` dataframe and try to, for example, scatter-plot two of its columns, what I obtain is only a subset of the desired points.

For example, if I try to plot two columns from a table with 1000000 rows, I only see some of the data. It looks like the first 1000 rows, but maybe I am being influenced by the Spark dataframe behavior of the `display` function, which states that it uses only the first 1000 rows when the table has more.

Is it possible to either force the plot to show all the data, or at least to know how much of the total data is being plotted?

4 REPLIES

Anonymous
Not applicable

Hello, @Davide Cagnoni - It's nice to meet you! My name is Piper, and I'm a moderator for the community. Thank you for bringing this question to us. Let's give your peers a chance to respond and we'll come back if we need to.

DavideCagnoni
Contributor

@Kaniz Fatma I need to use plotly in order to interact with the graph (zoom in, etc.), so this doesn't solve my problem...

User16255483290
Contributor

@Davide Cagnoni

It's a limitation of Databricks notebooks: they can't interact with graphs.

DavideCagnoni
Contributor

@Kaniz Fatma The problem is not about performance or plotly. It is about the pandas_on_spark dataframe arbitrarily subsampling the input data when plotting, without notifying the user about it.

While subsampling is understandable and maybe even necessary sometimes, at least a notification like the one shown when you `display(table)` would be useful.
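Until such a notice exists, one can approximate it by hand. A sketch of a hypothetical helper (not part of any Databricks or pyspark.pandas API) that mirrors the sampled-plot option semantics described above: an explicit `plotting.sample_ratio` wins, otherwise the drawn fraction is roughly `plotting.max_rows / total_rows`:

```python
import warnings

def plotted_fraction(total_rows, max_rows=1000, sample_ratio=None):
    """Approximate fraction of rows a sampled pandas-on-Spark plot draws.

    Hypothetical helper: an explicit sample_ratio takes precedence;
    otherwise the fraction is roughly max_rows / total_rows, capped at 1.0.
    """
    if sample_ratio is not None:
        return sample_ratio
    return min(1.0, max_rows / total_rows)

def warn_if_subsampled(total_rows, max_rows=1000, sample_ratio=None):
    """Emit the display()-style notice the thread asks for."""
    fraction = plotted_fraction(total_rows, max_rows, sample_ratio)
    if fraction < 1.0:
        warnings.warn(
            f"Plot will draw roughly {fraction:.2%} of {total_rows:,} rows; "
            "set plotting.sample_ratio to 1.0 to use all of them."
        )
    return fraction
```

For the 1000000-row table from the question, `plotted_fraction(1_000_000)` gives 0.001, matching the "first 1000"-like behavior observed; calling `warn_if_subsampled(len(psdf))` before plotting surfaces the missing notification.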
