Spark Remote error when connecting to cluster
08-24-2023 02:46 AM
Hi, I am using the latest version of PySpark and I am trying to connect to a remote cluster on Databricks Runtime 13.3.
My questions are:
- Do I need Databricks Unity Catalog enabled?
- My cluster's access mode is already set to Shared, so what other configuration is missing?
Error callstack:
status = StatusCode.FAILED_PRECONDITION
details = "INVALID_STATE: cluster xxx-xxx-xxx is not Shared or Single User Cluster."
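
For context, a minimal connection sketch, assuming the databricks-connect package for DBR 13.x (which is Spark Connect based) rather than plain pyspark; the host, token, and cluster ID values are placeholders, not taken from this thread:

```python
# Minimal Databricks Connect sketch (assumption: databricks-connect>=13.0 is
# installed and a personal access token is available).
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.remote(
    host="https://<your-workspace>.cloud.databricks.com",
    token="<personal-access-token>",
    cluster_id="<cluster-id>",  # must be a Shared or Single User access-mode cluster
).getOrCreate()

# Simple smoke test: runs on the remote cluster if the connection succeeds.
spark.range(5).show()
```

Note that the FAILED_PRECONDITION error above is raised server-side about the cluster's access mode, so the connection parameters themselves may be fine.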
08-28-2023 05:26 AM
Hi, is your workspace already Unity Catalog enabled? Also, did you go through the considerations for enabling a workspace for Unity Catalog?
Please let us know if this helps. Also, please tag @Debayan in your next comment, which will notify me. Thanks!
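
One way to check is sketched below; using the current_metastore() SQL function for this is an assumption on my part, not something cited in this thread:

```python
# Quick Unity Catalog check (assumption: reuses the `spark` session from the
# sketch above; current_metastore() only resolves on UC-enabled workspaces).
try:
    spark.sql("SELECT current_metastore()").show(truncate=False)
    print("Unity Catalog appears to be enabled for this workspace.")
except Exception as err:  # non-UC workspaces raise an analysis error here
    print(f"Unity Catalog check failed: {err}")
```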
10-30-2024 01:32 AM - edited 10-30-2024 01:33 AM
Do I need Unity Catalog to use Databricks Connect?

