I have found that when the cluster is shared, it automatically uses that type of session (Spark Connect), and in that case I have not been able to disable it. I don't know if this is your situation. With the check below, I have avoided some problems I was having.
Maybe you can do something like:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# A Spark Connect session's class comes from the connect client module
if spark.__class__.__module__ == "pyspark.sql.connect.session":
    pass  # code for when Spark Connect is enabled
else:
    pass  # code for when it is not
```
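
If you prefer an explicit type check, something along these lines might also work. This is just a sketch, assuming PySpark 3.4+, where the Spark Connect client class lives in `pyspark.sql.connect.session`; the `try`/`except` guards older versions that don't ship that module:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

try:
    # Spark Connect client class; only present in PySpark 3.4+
    from pyspark.sql.connect.session import SparkSession as ConnectSparkSession
    is_connect = isinstance(spark, ConnectSparkSession)
except ImportError:
    # Older PySpark without the connect module, so it can't be a Connect session
    is_connect = False

print("Running on Spark Connect:", is_connect)
```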