Issue connecting to a Databricks 15.4 cluster without Unity Catalog using Databricks Connect
02-14-2025 08:03 AM
Hi,
I have a shared cluster on Databricks that uses the 15.4 runtime.
I don't want to enable Unity Catalog for this cluster.
Previously I used Python 3.9.13 to connect to an 11.3 cluster with databricks-connect 11.3.
Now my company has restricted Python 3.9 and we must use Python 3.12.
I have installed databricks-connect 15.4 and am using Python 3.12.
When I run databricks-connect test, I get the error below:
pyspark.errors.exceptions.connect.SparkConnectGrpcException: BAD_REQUEST: SingleClusterComputeMode(xxx-xxxx-xxxxx) is not Shared or Single User Cluster. (requestId=dcd4fa22-5cf5-4a42-9e52-c03058f7c32b)
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1739548060.674965 22836 init.cc:232] grpc_wait_for_shutdown_with_timeout() timed out.
Please help me fix this error, or suggest an alternative way to work with Python 3.12 and databricks-connect from VS Code without Unity Catalog enabled.
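For reference, this is roughly how I create the session from VS Code. A minimal sketch only; the workspace URL, token, and cluster ID below are placeholders, not my real values:

```python
# Minimal sketch of a databricks-connect 15.4 session created from a local
# script. Host, token, and cluster ID are placeholders (assumptions).
from databricks.connect import DatabricksSession

spark = (
    DatabricksSession.builder
    .remote(
        host="https://<workspace-url>",    # workspace URL (placeholder)
        token="<personal-access-token>",   # PAT (placeholder)
        cluster_id="<cluster-id>",         # the 15.4 cluster's ID (placeholder)
    )
    .getOrCreate()
)

# A trivial query just to confirm the Spark Connect session works.
spark.range(5).show()
```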
02-14-2025 08:11 AM
Hi @prasidataengine,
For DBR 13.3 LTS and above, Unity Catalog must be enabled in order to use databricks-connect. From the cluster configuration requirements:
- A Databricks account and workspace that have Unity Catalog enabled. See Set up and manage Unity Catalog and Enable a workspace for Unity Catalog.
https://docs.databricks.com/en/dev-tools/databricks-connect/cluster-config.html
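Since the error says the cluster "is not Shared or Single User Cluster", you could also confirm the cluster's access mode programmatically. A rough sketch using the databricks-sdk, with a placeholder cluster ID, might look like this:

```python
# Hypothetical check of the cluster's access mode via the Databricks SDK.
# Shared (USER_ISOLATION) and Single User are the Unity Catalog access modes
# that databricks-connect on DBR 13.3+ expects.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # reads host/token from env vars or ~/.databrickscfg
cluster = w.clusters.get(cluster_id="<cluster-id>")  # placeholder ID
print(cluster.data_security_mode)  # e.g. USER_ISOLATION or SINGLE_USER
```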
02-14-2025 10:24 AM
Thanks for the reply. Is there any other option available?

