Customize job run name when running jobs from adf
07-20-2023 01:01 PM
Hi guys,
I am running my Databricks jobs on a job cluster from Azure Data Factory, using a Databricks Python activity.
When I monitor my jobs under Workflows > Job runs, I see that the run name is a concatenation of the ADF pipeline name, the Databricks Python activity name, and a run ID. How can I customize this job run name to make it easier to find a specific run?
Is there a way to do it through Spark session or Spark context properties?
Regards,
Hamza
04-17-2024 06:44 AM
Hi Hamza,
Did you ever find a solution for this issue?
Thanks
04-23-2024 06:58 AM
04-23-2024 08:28 AM
I don't think that level of customisation is provided out of the box. However, I can suggest some workarounds:
- REST API: Create a job on the fly with the desired name from within ADF and trigger it through the REST API in a Web activity. This way you can also track the job's completion status and decide what action to take depending on whether the run succeeded or failed.
- File trigger: Drop a file at a storage location from ADF, and set a file arrival trigger on the Databricks job. This ensures you are always running the same, consistently named job in Databricks.
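As a rough sketch of the REST API route: the Jobs API also has a one-time run endpoint (`POST /api/2.1/jobs/runs/submit`) that accepts a `run_name` field, which is the name shown under Workflows > Job runs. The workspace URL, token, notebook path, and cluster ID below are all placeholders you would supply from your ADF pipeline (e.g. via a Web activity instead of Python).

```python
import json
import urllib.request


def build_submit_payload(run_name: str, notebook_path: str, cluster_id: str) -> dict:
    """Build the request body for POST /api/2.1/jobs/runs/submit.

    The run_name here is what appears in Workflows > Job runs.
    """
    return {
        "run_name": run_name,
        "tasks": [
            {
                "task_key": "main",
                "existing_cluster_id": cluster_id,
                "notebook_task": {"notebook_path": notebook_path},
            }
        ],
    }


def submit_run(host: str, token: str, payload: dict) -> dict:
    """Submit a one-time run; the response contains a run_id you can poll."""
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/runs/submit",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    payload = build_submit_payload(
        run_name="my-adf-pipeline-nightly",       # your custom run name
        notebook_path="/Repos/project/main",       # placeholder path
        cluster_id="0423-123456-abcdef",           # placeholder cluster ID
    )
    # submit_run("https://adb-1234567890.12.azuredatabricks.net", "<token>", payload)
    print(payload["run_name"])
```

From ADF you would typically put the same JSON body in a Web activity and pass the pipeline name or run ID into `run_name` via an expression, then poll `GET /api/2.1/jobs/runs/get?run_id=...` for the result state.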
Hope this helps.

