Is there a solution to the above problem? I would also like to restart the SparkSession to free my cluster's resources, but when I call
spark.stop()
the notebook automatically detaches and the following error appears:
The spark context has stopped and the driver is restarting. Your notebook will be automatically reattached.
Is there a recommended way to restart the SparkSession without detaching the notebook?