01-11-2023 03:14 AM
I am working in a Databricks notebook and trying to display a map using Folium, and I keep getting this error:
> Command result size exceeds limit: Exceeded 20971520 bytes (current = 20973510)
How can I increase the memory limit?
I have already reduced the size of my pandas DataFrame.
Accepted Solutions
01-11-2023 03:46 AM
Notebooks have a default 20 MB limit on command output, and there is nothing you can do to raise it.
https://kb.databricks.com/en_US/jobs/job-cluster-limit-nb-output
01-11-2023 04:00 AM
Thanks, Daniel,
I have read that article. Since the problem occurs only when rendering the output, I'll save the result to disk as an HTML file and view it in the browser.
Have a good day.
Amine.
01-11-2023 09:27 AM
You could try another module, such as GeoPandas or Pygal.
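For example, a static GeoPandas plot produces a single image in the notebook output, which is usually far below the limit. A minimal sketch, assuming geopandas and matplotlib are installed (the points are made up for illustration):

```python
import geopandas as gpd
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for rendering to a file
from shapely.geometry import Point

# A tiny GeoDataFrame with made-up points; in practice this would come
# from a pandas DataFrame via gpd.points_from_xy(df.lon, df.lat).
gdf = gpd.GeoDataFrame(
    {"name": ["a", "b", "c"]},
    geometry=[Point(2.35, 48.85), Point(2.36, 48.86), Point(2.34, 48.84)],
    crs="EPSG:4326",
)

# Static matplotlib rendering: the notebook output is one image,
# not a multi-megabyte interactive HTML document.
ax = gdf.plot(markersize=20)
ax.figure.savefig("points.png")
```

The trade-off is losing interactivity (pan/zoom/popups) in exchange for an output small enough to render inline.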
03-09-2023 01:26 AM
Hi,
I have the same problem with keplergl, and the save-to-disk option, while helpful, isn't very practical. So how does one plot large datasets in Kepler?
Any thought welcome