Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

New Cluster 90% memory already consumed

AbhishekNegi
New Contributor

Hi, I'm seeing this on every new cluster (single- or multi-node) I create. As soon as the metrics start showing up, memory consumption already shows about 90% consumed between Used and Cached (something like below). This happens on both higher- and lower-memory clusters, and the amount consumed scales with total memory. The cluster is brand new, has never been used, and no libraries have been installed yet. Restarting does not change much either.

[Screenshots: cluster Metrics page showing memory already ~90% consumed between Used and Cached]

The problem is that when I attached a notebook to one of the new clusters, after the first run the cells would just hang on execution and Metrics would show 99%+ consumed. The cluster effectively becomes useless.

I have tried all of the suggestions I could find, such as spark.catalog.clearCache(), sqlContext.clearCache(), spark.sql("CLEAR CACHE"), the NukeAllCaching method, etc., without any benefit.
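For reference, the cache-clearing attempts listed above can be consolidated into one small helper. This is only a sketch (the function name is mine), and note the likely reason it has no effect: these calls free only Spark-managed storage memory (cached DataFrames/tables), whereas the "Cached" figure in cluster metrics generally reflects OS-level page cache, which Linux reclaims automatically when applications need memory.

```python
# Sketch: consolidate the cache-clearing attempts from the post into one helper.
# `spark` is assumed to be an active SparkSession (predefined in Databricks notebooks).
def clear_all_spark_caches(spark):
    """Best-effort clearing of Spark-managed caches.

    This frees Spark storage memory only; it does not shrink the OS-level
    "Cached" memory shown in cluster metrics, which is typically reclaimable
    Linux page cache rather than memory a notebook can run out of.
    """
    spark.catalog.clearCache()   # drop all cached DataFrames/tables
    spark.sql("CLEAR CACHE")     # SQL equivalent for cached table data
```

Usage in a notebook would simply be `clear_all_spark_caches(spark)` after the cells that cached data.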

Please advise on what I am missing to set up the cluster correctly.

0 REPLIES
