While using `databricks-sdk` in my code, I've found that checking PySpark object types is no longer reliable. I used to do the following:

```python
from pyspark.sql import Column, DataFrame, SparkSession

isinstance(spark, SparkSession)
isinstance(a_df...
```
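One pattern that sidesteps the problem is to match on the class's qualified name rather than its identity, so that sessions or DataFrames coming from a different module path (for example, Spark Connect's `pyspark.sql.connect` classes) are still accepted. This is a minimal sketch of that idea; `is_spark_session` is a hypothetical helper, not part of any library:

```python
def is_spark_session(obj) -> bool:
    """Hypothetical name-based check: accept any class named `SparkSession`
    defined under the `pyspark.sql` package, instead of relying on
    `isinstance`, which fails when the object comes from a different
    module path than the class you imported."""
    cls = type(obj)
    return cls.__name__ == "SparkSession" and cls.__module__.startswith("pyspark.sql")
```

The trade-off is that a name-based check is weaker than `isinstance` (any class that happens to share the name and module prefix would pass), but it tolerates the classic and Connect session classes living in different modules.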