Hi, I am using the latest version of pyspark and I am trying to connect to a remote cluster running Databricks Runtime 13.3.
My questions are:
- Do I need Databricks Unity Catalog enabled?
- My cluster's Access Mode is already set to Shared, so what other configuration is missing?
Error callstack:
status = StatusCode.FAILED_PRECONDITION
details = "INVALID_STATE: cluster xxx-xxx-xxx is not Shared or Single User Cluster."
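For context, the connection I am attempting looks roughly like the sketch below. The host, token, and cluster ID are placeholders, not my real values; the Spark Connect URI format shown is what Databricks Connect for DBR 13.x uses:

```python
# Build the Spark Connect URI used by Databricks Connect (DBR 13.x).
# All values are hypothetical placeholders — substitute your workspace
# host, personal access token, and cluster ID.
host = "adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace host
token = "dapi-XXXXXXXXXXXX"                          # placeholder access token
cluster_id = "0123-456789-abcde123"                  # placeholder cluster ID

uri = f"sc://{host}:443/;token={token};x-databricks-cluster-id={cluster_id}"
print(uri)

# With databricks-connect==13.3.* installed, the actual connection would be:
# from databricks.connect import DatabricksSession
# spark = DatabricksSession.builder.remote(uri).getOrCreate()
```

The `getOrCreate()` call is what fails for me with the FAILED_PRECONDITION error above.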