Hi Community,
I have set up a Jupyter notebook on a server and installed Databricks Connect in its kernel so I can use my Databricks cluster's compute from the notebook and write PySpark code.
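For context, the session is created roughly like this (a minimal sketch; the host, token, and cluster ID below are placeholders):

```python
# Minimal sketch of creating a Spark session via Databricks Connect.
# The host, token, and cluster ID are placeholders, not real values.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.remote(
    host="https://<workspace-instance>.cloud.databricks.com",
    token="<personal-access-token>",
    cluster_id="<cluster-id>",
).getOrCreate()

# Example PySpark code executed against the remote cluster.
df = spark.range(10)
df.show()
```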
Whenever I run my code, it gives me the warning below:
```
WARN SparkClientManager: DBConnect client for session <session_id> has been closed as the client cache reached the maximum size: 20. You can change the cache size by changing the conf value for spark.databricks.service.client.session.cache.size
```
What does this warning mean, and is it something to be concerned about?
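From the warning text, I assume the cache size could be raised by setting that conf in the cluster's Spark configuration, something like:

```
spark.databricks.service.client.session.cache.size 50
```

But I'm not sure whether that is the right place to set it, or whether raising it is the right fix at all.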