Hi community,
When I use PySpark RDD-related functions through Databricks Connect in my environment, I get the error below.
Databricks cluster runtime version: 12.2.
`RuntimeError: Python in worker has different version 3.9 than that in driver 3.10, PySpark cannot run with different minor versions. Please check environment variables PYSPARK_PYTHON...`
How can I resolve this?
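For context, here is a small sketch of the check I understand the error to be doing: the local client (driver) must run the same Python minor version as the cluster's workers. Databricks Runtime 12.2 LTS ships Python 3.9 (per its release notes, an assumption on my part), so the local environment would need to be on 3.9 as well, e.g. via pyenv or conda.

```python
import sys

# Assumption: DBR 12.2 LTS workers run Python 3.9.
CLUSTER_PYTHON = (3, 9)

def versions_match(local, cluster=CLUSTER_PYTHON):
    """Return True when the local driver's (major, minor) Python
    version matches the cluster workers' version, which is the
    condition PySpark enforces before running jobs."""
    return tuple(local[:2]) == tuple(cluster[:2])

# Check the currently running interpreter against the cluster.
print(versions_match(sys.version_info))
```

In my case the driver reports 3.10, so the check fails; recreating the local environment with Python 3.9 (for example `conda create -n dbconnect python=3.9`) is what I am considering.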