Hi, I am seeing this on every new cluster (single or multi-node) I create. As soon as the metrics start showing up, memory consumption already shows roughly 90% consumed between Used and Cached (something like below). This happens on both higher- and lower-memory clusters, and the percentage consumed increases with total memory. The cluster is brand new, has not been used, and no libraries have been installed yet. Restarting does not change much either.
The problem is that when I attached a notebook to one of the new clusters, after the first run the cells would just hang on execution and Metrics would show 99%+ consumed. The cluster would effectively become useless.
I have tried all of the suggestions I could find, such as spark.catalog.clearCache(), sqlContext.clearCache(), spark.sql("CLEAR CACHE"), the NukeAllCaching method, etc., without any benefit. A sketch of what I ran is below.
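For reference, this is roughly what I ran in a notebook cell on the fresh cluster (NukeAllCaching is a custom helper suggested on another thread, not a built-in API, so I have only noted it in a comment rather than reproducing it here):

```python
# Roughly what I ran in a notebook cell on the brand-new cluster.
# None of these reduced the memory shown in the Metrics tab.

# Clear cached tables/DataFrames via the catalog API
spark.catalog.clearCache()

# Same thing through the older SQLContext handle that
# Databricks notebooks predefine as `sqlContext`
sqlContext.clearCache()

# SQL equivalent of the above
spark.sql("CLEAR CACHE")

# I also tried the NukeAllCaching helper mentioned elsewhere
# (a custom routine, not part of Spark), with the same result.
```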
Please advise what I am missing to set up the cluster correctly.