Hi @Rainer,

When you use Databricks Connect, your local code is executed against the Databricks cluster, so it runs on the Databricks Runtime's PySpark rather than your local PySpark installation; the driver is also running on the remote compute. Databricks Runtime is built on the open-source Apache Spark codebase, but it includes patches, backports, and enhancements that have not yet been released in the official open-source PySpark packages on PyPI. That is why DBR behaves and is optimized differently from open-source PySpark, and why it is distinct from other Spark providers such as Fabric. A small sketch of this follows below.
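
For illustration, here is a minimal sketch of what that looks like in practice. It assumes `databricks-connect` 13+ is installed locally and that a default authentication profile (or the equivalent environment variables) is already configured; those details are my assumptions, not something from your setup.

```python
# Minimal Databricks Connect sketch (assumes databricks-connect >= 13 and a
# configured default profile / environment variables for host, token, cluster).
from databricks.connect import DatabricksSession

# The builder picks up connection details from the local Databricks config;
# they can also be supplied explicitly with .remote(...).
spark = DatabricksSession.builder.getOrCreate()

# Reports the Spark version of the Databricks Runtime on the remote cluster,
# not the version of any pyspark package installed locally.
print(spark.version)

# This query is planned and executed on the remote cluster's driver and
# workers; only the resulting rows are sent back to your machine.
spark.range(5).show()
```

If you print `spark.version` this way and compare it with `pip show pyspark` locally, you can see directly that the code is running on the runtime's Spark, not your local installation.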