Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.
Getting client.session.cache.size warning in PySpark code using Databricks Connect

Surajv
New Contributor III

Hi Community,

I have set up a Jupyter notebook on a server and installed Databricks Connect in its kernel so I can use my Databricks cluster's compute from the notebook and write PySpark code.

Whenever I run my code, it gives me the warning below:

```
WARN SparkClientManager: DBConnect client for session <session_id> has been closed as the client cache reached the maximum size: 20. You can change the cache size by changing the conf value for spark.databricks.service.client.session.cache.size
```

Is this a concerning warning? And what does it mean? 

2 REPLIES

Riyakh
New Contributor II

The warning indicates that the client cache (used to manage connections between your local environment and the Databricks cluster) has reached its maximum size (20 sessions). When this limit is reached, the oldest session is closed to make room for a new one.
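The eviction behavior described above is essentially a bounded, insertion-ordered cache: once the limit is hit, the oldest entry is closed to admit a new one. A toy sketch of that idea (illustrative only, not actual Databricks Connect internals; the `SessionCache` class and its names are made up):

```python
from collections import OrderedDict

class SessionCache:
    """Toy illustration of a bounded session cache that closes the
    oldest session when the maximum size is reached (the point at
    which the WARN message in the question would be logged)."""

    def __init__(self, max_size=20):  # 20 mirrors the default limit
        self.max_size = max_size
        self._sessions = OrderedDict()
        self.closed = []  # ids of sessions evicted from the cache

    def get_or_create(self, session_id):
        if session_id in self._sessions:
            # Reuse an existing session; mark it most recently used.
            self._sessions.move_to_end(session_id)
            return self._sessions[session_id]
        if len(self._sessions) >= self.max_size:
            # Cache full: close the oldest session to make room.
            oldest_id, _ = self._sessions.popitem(last=False)
            self.closed.append(oldest_id)
        self._sessions[session_id] = object()  # stand-in for a client
        return self._sessions[session_id]

cache = SessionCache(max_size=3)
for sid in ["a", "b", "c", "d"]:
    cache.get_or_create(sid)
print(cache.closed)  # session "a" was closed to make room for "d"
```

The practical takeaway: reusing one session (or raising the limit) avoids the churn; the warning only fires when a new session pushes an old one out.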

As the warning suggests, you can raise the limit by changing the conf value spark.databricks.service.client.session.cache.size.
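If you do want a larger cache, one place to set this (assuming you can edit the cluster's configuration; the value 40 here is only an example, not a recommendation) is the cluster's Spark config:

```
spark.databricks.service.client.session.cache.size 40
```

A restart of the cluster is generally needed for Spark config changes to take effect.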

The warning itself is not critical, but if you frequently open and close sessions, you may encounter performance issues due to cache management.

Surajv
New Contributor III

Thank you @Riyakh 
