How to get rid of "Command result size exceeds limit"

AmineHY
Contributor

I am working in a Databricks notebook and trying to display a map using Folium, and I keep getting this error:

> Command result size exceeds limit: Exceeded 20971520 bytes (current = 20973510)

How can I increase the memory limit?

I already reduced the size of my pandas dataframe.
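
For context, the pattern is roughly the following (the dataframe source and column names here are illustrative, not my actual code):

```python
import folium
import pandas as pd

# illustrative dataframe of point coordinates with lat/lon columns
df = pd.read_parquet("/dbfs/tmp/points.parquet")

m = folium.Map(location=[df["lat"].mean(), df["lon"].mean()], zoom_start=5)
for _, row in df.iterrows():
    folium.CircleMarker(location=[row["lat"], row["lon"]], radius=2).add_to(m)

# displaying the map inlines its full HTML into the cell output,
# which is what exceeds the 20 MB result-size limit
m
```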

1 ACCEPTED SOLUTION


daniel_sahal
Honored Contributor III

Notebooks have a default 20 MB limit for command output. There's nothing you can do about that.

https://kb.databricks.com/en_US/jobs/job-cluster-limit-nb-output


4 REPLIES


AmineHY
Contributor

Thanks, Daniel.

I have read this article. I guess this problem only occurs while rendering the output, so I'll try saving the result to disk in HTML format and visualizing it through the browser.
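
Something along these lines; the FileStore path is just an example:

```python
# render the Folium map to an HTML file on DBFS instead of displaying it inline
m.save("/dbfs/FileStore/folium_map.html")

# files under /dbfs/FileStore can then be opened in the browser, e.g.
# https://<workspace-url>/files/folium_map.html
```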

Have a good day.

Amine.

sher
Valued Contributor II

You can try another module, like GeoPandas or Pygal.
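
For example, a GeoPandas plot renders as a static image rather than a big HTML document, so the output stays well under the limit (the lat/lon column names are assumed):

```python
import geopandas as gpd
import matplotlib.pyplot as plt

# build a GeoDataFrame from the existing pandas dataframe
gdf = gpd.GeoDataFrame(df, geometry=gpd.points_from_xy(df["lon"], df["lat"]))

# .plot() returns a static matplotlib axes instead of an interactive HTML map
ax = gdf.plot(markersize=1, figsize=(10, 8))
plt.show()
```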

labromb
Contributor

Hi,

I have the same problem with keplergl, and the save-to-disk option, whilst helpful, isn't super practical... So how does one plot large datasets in Kepler?
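
For reference, the save-to-disk approach I'm referring to looks roughly like this (paths and names are illustrative):

```python
from keplergl import KeplerGl

# build the map from a (large) pandas dataframe
kepler_map = KeplerGl(height=600)
kepler_map.add_data(data=df, name="points")

# write the rendered map to DBFS instead of displaying it inline
kepler_map.save_to_html(file_name="/dbfs/FileStore/kepler_map.html")
```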

Any thoughts welcome.
