06-30-2025 07:29 AM
Hello all,
We are currently running Azure Databricks notebooks through Azure Data Factory pipelines, using job clusters defined through cluster policies. As an admin, I am able to see the list of all job clusters that have been created when ADF calls the Databricks notebooks. However, non-admin users cannot. Where are the permissions that would let them see these job clusters?
Note that we are not using Databricks Workflows, so the pipelines do not appear there. Also, the ADF pipelines are run through a service principal.
Thank you so much for the help,
Sacha
06-30-2025 02:22 PM
Hello @sachamourier
It's likely that when the job was created, no view permissions were granted to that user. As an admin, you're able to see the job, but my recommendation is:
Go to the job, scroll down to the bottom of the right-hand sidebar, and under Permissions > Manage permissions, assign the appropriate access level based on your needs, for example Can View, Can Manage Run, etc.
I tested it and confirmed that if you don't have permissions, you are not allowed to see the jobs at all.
Confirm whether this matches the issue you're experiencing. In the job definition in ADF (where you configure the job), you'll be able to add the required permissions for the user or group that needs visibility over the job.
Here's the official documentation for reference:
https://docs.databricks.com/gcp/en/jobs/privileges#control-access-to-a-job
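If you prefer to script this instead of clicking through the UI, something like the following should work with the Databricks SDK for Python. This is a minimal sketch, not something from the docs above: the job ID "123456" and the group name "data-engineers" are placeholders you would replace with your own values.

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import iam

w = WorkspaceClient()  # authenticates via env vars or a config profile

# Amend the job's ACL to grant the group read-only visibility.
# update() adds to the existing ACL instead of replacing it.
w.permissions.update(
    request_object_type="jobs",
    request_object_id="123456",  # placeholder job ID
    access_control_list=[
        iam.AccessControlRequest(
            group_name="data-engineers",  # placeholder group
            permission_level=iam.PermissionLevel.CAN_VIEW,
        )
    ],
)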
Hope this helps,
Isi
06-30-2025 02:46 PM
Hello @Isi,
Thank you for your help! I noticed the option to do that in the right-hand sidebar, but I believe it only grants access to a previously run job with a given ID, not to future ones.
For example, as shown in the attachment, I have granted "Can view" to the data engineer group, but it applies to a single run only. What I want is for this to apply to every job run by my ADF. If I run the same execution (notebook) tomorrow, that run will have another ID, and the data engineer group will no longer have the "Can view" permission.
In ADF we are not configuring jobs, if that is what you are referring to; we only configure Databricks Notebook activities, where a notebook is tied to a linked service that creates a job cluster for itself and then disappears.
I hope my issue is clearer. Thank you very much again for helping, it's highly appreciated!
Sacha
06-30-2025 03:15 PM
Hi @sachamourier ,
I've gone through the documentation, and it seems there is no direct way to assign permissions through ADF. None of the configuration fields allow setting permissions, and since the job is not persistent and not tied to a group or workflow in Databricks, you're essentially limited by ADF itself.
My recommendation would be to consider creating a Databricks workflow/job and having ADF trigger it instead. This way the job is persistent, and you can manage permissions, visibility, and history much more effectively.
An even better alternative, and what I would personally suggest, is to create the Databricks jobs programmatically using the Databricks Jobs API; see the sketch below. This gives you full control over what gets executed, lets you easily replicate configurations between executions, and provides better visibility, notifications, and cluster management. While ADF's Notebook Activity is more convenient from an orchestration perspective, it sacrifices too much in terms of observability and governance, so I believe it should only be used for very specific or isolated use cases.
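As a rough sketch of that approach with the Databricks SDK for Python (the job name, notebook path, cluster ID, and group name are all placeholders, and the exact field names may differ slightly between SDK versions):

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # admin or job-owner credentials

# Create a persistent job wrapping the notebook ADF used to call directly.
created = w.jobs.create(
    name="adf-triggered-notebook",
    tasks=[
        jobs.Task(
            task_key="run_notebook",
            notebook_task=jobs.NotebookTask(
                notebook_path="/Shared/my_notebook"  # placeholder path
            ),
            existing_cluster_id="<cluster-id>",  # or a new_cluster spec
        )
    ],
    # Grant non-admins visibility once, at creation time; every run inherits it.
    access_control_list=[
        jobs.JobAccessControlRequest(
            group_name="data-engineers",  # placeholder group
            permission_level=jobs.JobPermissionLevel.CAN_VIEW,
        )
    ],
)
print(created.job_id)

ADF can then trigger this persistent job (for example, through a Web activity calling the jobs/run-now REST endpoint with that job ID), so every run shows up under the same job with the same permissions.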
That said, if you do find a solution for propagating permissions through ADF's Notebook Activity (maybe there is one I've missed), it would be great if you could share it. I'm sure other users will run into the same limitation.
Hope this helps
Isi
07-01-2025 02:51 AM
@Isi, this is really helpful, and the alternative makes more sense.
07-02-2025 09:27 AM
Yes, if you run a notebook activity from ADF, you will not be able to see the job runs in Databricks unless you are an admin. But if you want to see any details, you can use Python code to list the job runs and the details of each run.
You need a service principal that is an admin in the Databricks workspace. Here is the code.
from databricks.sdk import WorkspaceClient

# Authenticate as an Azure service principal with admin rights in the workspace
w = WorkspaceClient(
    host="<your workspace URL>",
    azure_tenant_id="<tenant id>",
    azure_client_id="<client id>",
    azure_client_secret="<client secret>",
)

# List all job runs and print the ones triggered by ADF
for run in w.jobs.list_runs():
    if run.run_name and "ADF" in run.run_name:
        print(run)