Due to some problems, I need to set Py4JSecurity to false; however, I get the following message:
"spark.databricks.pyspark.enablePy4JSecurity is not allowed when choosing an access mode"
My cluster uses the Shared access mode with Unity Catalog, on runtime version 12.1 (includes Apache Spark 3.3.1, Scala 2.12), with worker type Standard_DS4_v2 and driver type Standard_DS3_v2.
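For context, I'm applying the setting through the cluster's Spark config field (under Advanced options) — that part is just how my setup does it, and the key name is the one from the error above — roughly like this:

```
spark.databricks.pyspark.enablePy4JSecurity false
```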
I need it to be a shared cluster that uses Unity Catalog, so is there any way to get around this?
Thanks!