On a Databricks cluster (the Databricks compute resource), a SparkSession already exists. Instead of initializing that object yourself, you can use the variable `spark`, which is pre-created on the cluster. It can be re-configured with other settings if needed.
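As a minimal sketch of what this looks like in a Databricks notebook (the table name and configuration value below are only illustrative, not taken from any particular workspace):

```python
from pyspark.sql import SparkSession

# In a Databricks notebook, `spark` is already defined; there is no need to build one.
# getOrCreate() simply returns the existing session rather than creating a new one.
spark = SparkSession.builder.getOrCreate()

# Re-configure a runtime setting on the existing session
# (the key is real; the value 64 is just an example).
spark.conf.set("spark.sql.shuffle.partitions", "64")

# Use the pre-created session directly; this table name is hypothetical.
df = spark.read.table("my_catalog.my_schema.my_table")
df.show(5)
```

Note that only runtime SQL settings can be changed this way on the live session; cluster-level Spark properties are set in the cluster configuration, not from the notebook.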