PERMISSION_DENIED: Cannot access Spark Connect. when trying to run serverless databricks connect
07-25-2025 12:24 PM
I am not able to run a file via "Run as workflow" or "Run with Databricks Connect" when I choose serverless on my paid account. However, I can perform this action in my Free Edition account.
See the error:
pyspark.errors.exceptions.connect.SparkConnectGrpcException: PERMISSION_DENIED: Cannot access Spark Connect. (requestId=e6437897-ac03-4c06-b7a6-5b25ee676d03)
07-26-2025 05:11 PM
I'm having the same problem. I'm an admin on both the account and the workspace. I get this with `databricks-connect test` too. I've tried updating my databricks-connect version to `==16.4.*`, which seems to be the latest version with serverless support.
One thing I have noticed: even though I can use serverless in the web UI for notebooks and SQL warehouses, I don't see the serverless option under feature enablement in the account settings. I suspect my account is misconfigured, so I'm contacting support about that.
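For anyone else debugging this, here is a minimal sketch of how I'd rule out client-side configuration before blaming permissions, assuming databricks-connect `16.4.*` is installed and the usual `DATABRICKS_HOST`/`DATABRICKS_TOKEN` auth is set up (the helper name is my own, not part of the SDK):

```python
import os

# Opt into serverless compute via the environment variable that
# databricks-connect recognizes ("auto" picks serverless for you).
os.environ.setdefault("DATABRICKS_SERVERLESS_COMPUTE_ID", "auto")

def build_serverless_session():
    # Import inside the function so this sketch loads even on a
    # machine without databricks-connect installed.
    from databricks.connect import DatabricksSession

    # Explicitly request serverless rather than a named cluster.
    return DatabricksSession.builder.serverless(True).getOrCreate()

# Only attempt the connection when workspace credentials are present;
# a PERMISSION_DENIED here (rather than an auth error) points at the
# workspace/account side, not the local setup.
if os.environ.get("DATABRICKS_HOST"):
    spark = build_serverless_session()
    print(spark.range(1).count())
```

If this raises the same `PERMISSION_DENIED: Cannot access Spark Connect` error, the client is fine and the problem is on the workspace or entitlement side.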
08-02-2025 11:23 AM
I think adding this to my databricks.yml file helped. I probably did have a permissions problem, and granting these permissions over the whole bundle solved my issue.
```yaml
permissions:
  - group_name: my-admin-group
    level: CAN_MANAGE
  - group_name: my-user-group
    level: CAN_VIEW
```
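In case it helps others, a top-level `permissions` block like that applies to every resource the bundle deploys. A minimal sketch of where it sits in a full `databricks.yml` (the bundle name, target, and host are placeholders, not from my setup):

```yaml
# Hypothetical minimal bundle config; all names are placeholders.
bundle:
  name: my_bundle

# Bundle-wide grants, applied to all deployed resources.
permissions:
  - group_name: my-admin-group
    level: CAN_MANAGE
  - group_name: my-user-group
    level: CAN_VIEW

targets:
  dev:
    mode: development
    workspace:
      host: https://my-workspace.cloud.databricks.com
```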
07-28-2025 06:25 AM
Hi @ivan7256 ,
This might be because serverless compute isn't enabled for workflows in your paid workspace.