Run now with different parameters doesn't pass parameter to pipeline tasks
09-09-2024 08:43 AM
I have a job with several tasks. Some of the tasks are pipeline_tasks and some are notebook_tasks.
When I run the job with "Run now with different parameters" and enter a new key-value pair, the key-value is available in the notebook_tasks via dbutils.widgets.get, but it is NOT available in my pipelines via either spark.conf.get or dbutils.widgets.get.
For example, if I "run now with different parameters" and pass in the key "fav_color" with the value "blue", then dbutils.widgets.get("fav_color") and spark.conf.get("fav_color") do NOT exist in my pipeline_task. In my notebook_task, I AM able to get the value from dbutils.widgets.get("fav_color").
My question is: How do I get these new parameters in my pipeline_task that were added via "run now with different parameters"?
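In other words, using the example parameter above, the behavior looks roughly like this:

```python
# In a notebook_task, after "Run now with different parameters":
fav_color = dbutils.widgets.get("fav_color")   # returns "blue"

# In a pipeline_task, neither of these finds the parameter:
fav_color = dbutils.widgets.get("fav_color")   # fails: widget not defined
fav_color = spark.conf.get("fav_color")        # fails: key not found
```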
09-13-2024 10:21 AM
According to the docs, the pipeline task type does not currently support passing job parameters: https://docs.databricks.com/en/jobs/create-run-jobs.html#pass-parameters-to-a-databricks-job-task
You could create a notebook task that runs before your pipeline task. This notebook task would take the job parameters and write them to a location accessible by your pipeline task, such as DBFS (Databricks File System) or a database. Then, your pipeline task could read these parameters from that location.
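Here is a minimal sketch of that workaround, assuming the "fav_color" parameter from your example and a DBFS path I made up for illustration (adjust the location to whatever your pipeline can read):

```python
# --- Notebook task that runs BEFORE the pipeline task ---
import json

# Read the job parameter passed via "Run now with different parameters".
fav_color = dbutils.widgets.get("fav_color")

# Persist it to a shared location the pipeline can read.
# The path is an assumption for this sketch; pick any location your pipeline can access.
dbutils.fs.put(
    "dbfs:/tmp/job_params/fav_color.json",
    json.dumps({"fav_color": fav_color}),
    overwrite=True,
)

# --- Inside the pipeline notebook ---
# Read the parameter back from the shared location.
params = spark.read.json("dbfs:/tmp/job_params/fav_color.json").first().asDict()
fav_color = params["fav_color"]
```

If several concurrent runs could overwrite each other, you may want to include the job run ID in the file path (or write to a keyed table instead) so each run reads its own parameters.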