Databricks single user compute cannot write to storage
01-06-2025 08:07 PM
I've deployed unrestricted single-user compute for each developer in our dev workspace, and everything works fine except for writing to storage: the cell runs continuously but never seems to execute anything. If I switch to unrestricted shared compute with the same runtime (16.1), it works fine. The workspace is UC-enabled and writes to an ADLS Gen2 account. What could be the issue here?
01-07-2025 04:40 AM
Hi @jar,
Single-user mode runs with the credentials of the cluster's owner. Do the users writing to the ADLS storage have the required permissions on the storage account?
01-07-2025 05:53 PM
Hi @Alberto_Umana, and thank you for your reply. Yes, they have the Storage Blob Data Contributor role on the storage account.
01-08-2025 04:37 AM
Hi @jar,
Thanks! How have you configured the ADLS storage with UC? If it's an external location, do the users have permissions on it in Unity Catalog?
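In case it helps, the current grants on an external location can be inspected from a notebook; the name `my_ext_loc` below is a placeholder for your actual external location:

```sql
-- Replace my_ext_loc with the name of your external location.
SHOW GRANTS ON EXTERNAL LOCATION my_ext_loc;
```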
01-08-2025 08:57 PM
Hi @Alberto_Umana.
Yes, as an external location. I myself cannot write data to storage with the single-user cluster (reading works fine, though), and I have all privileges on the external location (container) I'm trying to write to, including Storage Blob Data Contributor on the ADLS account.
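For reference, a minimal sketch of the kind of write that hangs, assuming a hypothetical path under the external location (substitute your own container and account):

```sql
-- Hypothetical abfss path; replace with a path under your external location.
CREATE TABLE delta.`abfss://<container>@<account>.dfs.core.windows.net/tmp/write_test`
AS SELECT 1 AS id;
```

On the single-user cluster this statement never completes; on shared compute with the same runtime it succeeds.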
01-09-2025 05:12 AM
Hi @jar,
Thanks for your comments. So, using the single-user cluster, you are able to read the data in the external location but not write to it? Also, what are the Unity Catalog permissions on the catalog.schema.table?
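In case it's useful, the table grants can be checked the same way; the three-level name below is a placeholder:

```sql
-- Substitute your actual catalog, schema, and table names.
SHOW GRANTS ON TABLE <catalog>.<schema>.<table>;
```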
01-12-2025 08:11 PM
Hi @Alberto_Umana
That is correct. I don't think UC permissions on the table are the issue, as all operations work perfectly fine on a non-single-user cluster. Also, I'm the admin, so I have all permissions on the table.
01-13-2025 02:19 AM
1) Check the "run as" setting of your workflow: the job runs as the user/service principal defined there, and that identity needs all the necessary permissions.
2) If this table is already registered as catalog.schema.table in UC, then that user/SP needs the necessary permissions there as well; I have seen these supersede external location permissions.
3) If point 2 doesn't apply, grant READ FILES and WRITE FILES on the external location in UC to this user/SP.
Give it a try.
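A sketch of point 3, assuming the external location is named `my_ext_loc` and the identity is `user@example.com` (both placeholders):

```sql
-- Placeholder names; substitute your external location and user/SP.
GRANT READ FILES, WRITE FILES ON EXTERNAL LOCATION my_ext_loc TO `user@example.com`;
```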
01-13-2025 04:35 AM
Adding to @saurabh18cs's comments: also check whether any instance profile is attached to the cluster.
What is the difference between the clusters, only the access mode?

