Hi @leireroman, Databricks Runtime 16.4 LTS includes Delta Lake 3.3.1 paired with Spark 3.5.2. That combination works inside Databricks because it's a custom build, but in a plain Conda environment the pip resolver hits a conflict: delta-spark 3.3.1 requires pyspark >= 3.5.3, while you've pinned pyspark to 3.5.2.
To resolve this, you can either:
- Upgrade pyspark to 3.5.3 to match delta-spark 3.3.1
- Downgrade delta-spark to 3.2.0 to stay compatible with pyspark 3.5.2
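Whichever option you pick (e.g. `pip install pyspark==3.5.3 delta-spark==3.3.1`, or `pip install pyspark==3.5.2 delta-spark==3.2.0`), a quick sanity check like the sketch below should confirm the pairing works. It uses `configure_spark_with_delta_pip` from the delta-spark package, per the Delta Lake quickstart; the app name and table path are placeholders:

```python
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip

# Build a local SparkSession wired up for Delta Lake.
# configure_spark_with_delta_pip pulls in the Delta JARs matching the
# pip-installed delta-spark version, so the Spark/Delta pairing stays consistent.
builder = (
    SparkSession.builder.appName("delta-version-check")  # placeholder app name
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

print("Spark version:", spark.version)  # expect 3.5.x

# Smoke test: write and read back a tiny Delta table (path is a placeholder).
spark.range(5).write.format("delta").mode("overwrite").save("/tmp/delta-smoke-test")
spark.read.format("delta").load("/tmp/delta-smoke-test").show()
```

If pip still reports the same conflict after changing the pin, it's usually easiest to recreate the Conda environment from scratch rather than upgrading in place.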