Hello everybody,
I want to run a job that collects some CSV files from an SFTP server and saves them in my Unity Catalog Volume. While my personal cluster, defined as follows, has access to create folders on the volume, my job cluster doesn't.
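For context, the job itself is essentially the following sketch: download every CSV from an SFTP directory into a folder on the volume. Host, credentials, catalog/schema/volume names, and the `sftp_ingest` subfolder are all placeholders of mine, not real values.

```python
# Hedged sketch of the job: copy CSVs from SFTP into a UC Volume.
# All names (host, volume path, subfolder) are placeholder assumptions.
import os
import posixpath

VOLUME_ROOT = "/Volumes/my_catalog/my_schema/my_volume"  # hypothetical volume


def destination_path(remote_name: str, subfolder: str = "sftp_ingest") -> str:
    """Map a remote CSV filename to a path inside the volume."""
    return posixpath.join(VOLUME_ROOT, subfolder, remote_name)


def fetch_csvs(host: str, user: str, password: str, remote_dir: str) -> None:
    """Download every .csv in remote_dir onto the volume (needs paramiko)."""
    import paramiko  # e.g. installed on the cluster via %pip install paramiko

    with paramiko.SSHClient() as ssh:
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh.connect(host, username=user, password=password)
        sftp = ssh.open_sftp()
        # This makedirs call is where the job fails when the cluster
        # cannot create folders on the volume.
        os.makedirs(posixpath.join(VOLUME_ROOT, "sftp_ingest"), exist_ok=True)
        for name in sftp.listdir(remote_dir):
            if name.endswith(".csv"):
                sftp.get(posixpath.join(remote_dir, name), destination_path(name))
        sftp.close()
```

On my personal cluster this runs fine; on the job cluster the folder creation is what fails.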
Definition of my own cluster:

Definition of my job cluster:
job_clusters:
  - job_cluster_key: my_job_cluster
    new_cluster:
      cluster_name: ""
      spark_version: 16.2.x-scala2.12
      azure_attributes:
        first_on_demand: 1
        availability: SPOT_WITH_FALLBACK_AZURE
        spot_bid_max_price: 100
      node_type_id: Standard_DS3_v2
      enable_elastic_disk: true
      data_security_mode: NONE
      runtime_engine: STANDARD
      autoscale:
        min_workers: 1
        max_workers: 2
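For comparison, here is the same cluster definition with an explicit access mode set instead of NONE. That SINGLE_USER is the right value is only my assumption, I haven't verified it yet:

      data_security_mode: SINGLE_USER  # assumption: an access mode other than NONE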
First I ran the job as a Service Principal ("run as"). After changing "run as" to my own user, the job cluster still could not create a folder on the volume.
Regarding the volume, I've granted the following permissions:

So how can I grant the job cluster permission to create folders on the volume? Unfortunately, as far as I could find out, there is no way to set cluster permissions or anything similar in Databricks Asset Bundles.
Thank you for every answer! I appreciate it.