Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Job cluster has no permission to create folder in Unity Catalog Volume

Johannes_E
New Contributor III

Hello everybody,

I want to run a job that collects some CSV files from an SFTP server and saves them to my Unity Catalog volume. While my personal cluster, defined as shown below, has permission to create folders on the volume, my job cluster doesn't.

Definition of my own cluster:

[Screenshot: personal cluster configuration]

Definition of my job cluster:

      job_clusters:
        - job_cluster_key: my_job_cluster
          new_cluster:
            cluster_name: ""
            spark_version: 16.2.x-scala2.12
            azure_attributes:
              first_on_demand: 1
              availability: SPOT_WITH_FALLBACK_AZURE
              spot_bid_max_price: 100
            node_type_id: Standard_DS3_v2
            enable_elastic_disk: true
            data_security_mode: NONE
            runtime_engine: STANDARD
            autoscale:
              min_workers: 1
              max_workers: 2

First I ran the job with a Service Principal ("run as"). After changing "run as" to my own user, the job cluster still could not create a folder on the volume.

Regarding the volume, I've granted the following permissions:

[Screenshot: permissions granted on the volume]

So how can I grant the job cluster permission to create folders on the volume? Unfortunately, as far as I could find out, there is no way to set cluster permissions or anything similar in Databricks Asset Bundles.

Thank you for every answer! I appreciate it.


2 REPLIES

szymon_dybczak
Esteemed Contributor III

Hi @Johannes_E ,

I think the problem could be related to how you've configured data_security_mode in your cluster definition. Try using "DATA_SECURITY_MODE_DEDICATED".

Johannes_E
New Contributor III

Thank you, that helped 🙂 although I had to use "SINGLE_USER" instead of "DATA_SECURITY_MODE_DEDICATED". According to the docs (https://docs.databricks.com/api/workspace/clusters/create), "DATA_SECURITY_MODE_DEDICATED" is listed as an alias for "SINGLE_USER", so the two values are equivalent.
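
For reference, a minimal sketch of how the job cluster definition from the question might look with this change applied. Only the data_security_mode line differs; everything else is assumed to stay exactly as in the original definition, and the comments reflect my understanding of why the access mode matters here:

      job_clusters:
        - job_cluster_key: my_job_cluster
          new_cluster:
            cluster_name: ""
            spark_version: 16.2.x-scala2.12
            azure_attributes:
              first_on_demand: 1
              availability: SPOT_WITH_FALLBACK_AZURE
              spot_bid_max_price: 100
            node_type_id: Standard_DS3_v2
            enable_elastic_disk: true
            # Changed from NONE: a cluster with no access mode is not Unity Catalog enabled,
            # so grants on the volume cannot take effect. SINGLE_USER (alias: DATA_SECURITY_MODE_DEDICATED)
            # runs the cluster as a single identity (by default the job's "run as" user),
            # to whom the READ VOLUME / WRITE VOLUME grants then apply.
            data_security_mode: SINGLE_USER
            runtime_engine: STANDARD
            autoscale:
              min_workers: 1
              max_workers: 2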