<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Passing Parameters *between* Workflow run_job steps in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153652#M53988</link>
    <description>&lt;P&gt;I think this makes a lot of sense.&lt;/P&gt;&lt;P&gt;One follow-up question: do Lakeflow Jobs in some way support the ability for a child task to create or update a parent task value? An example in the context I shared earlier:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;We have a Lakeflow Job with two sequential run_job steps&lt;/LI&gt;&lt;LI&gt;The first run_job (let's call it Parent 1) has a notebook task (let's call it Child 1)&lt;/LI&gt;&lt;LI&gt;Somehow (if possible) I would like the notebook task to use `&lt;SPAN&gt;dbutils.jobs.taskValues.set()&lt;/SPAN&gt;` referencing Parent 1 as the task&lt;/LI&gt;&lt;LI&gt;If this can be achieved, then when we zoom out, Parent 1 already has the taskValue set, and it can then be referenced in the following run_job (let's call it Parent 2)&lt;/LI&gt;&lt;LI&gt;Once Parent 2 starts running, all Child 2-related tasks would have access to the referenced variable&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Let me know if this makes sense and whether it's even possible.&lt;/P&gt;</description>
    <pubDate>Tue, 07 Apr 2026 19:36:24 GMT</pubDate>
    <dc:creator>ChristianRRL</dc:creator>
    <dc:date>2026-04-07T19:36:24Z</dc:date>
    <item>
      <title>Passing Parameters *between* Workflow run_job steps</title>
      <link>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153562#M53972</link>
      <description>&lt;P&gt;Hi there, I'm trying to reference a task value - let's call it `output_path` (not known until programmatically generated by the code) - that is created in a nested task (Child 1) within a run_job (Parent 1) as an input parameter - let's call it `input_path` - for a downstream run_job (Parent 2). I understand that due to the way variable scoping works, this may not typically be possible, and I'm looking into some possible ways to do it.&lt;/P&gt;&lt;P&gt;Some approaches I'm currently considering:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Create a "placeholder" task or run_job parameter variable that is updated by the nested task (Child 1)&lt;UL&gt;&lt;LI&gt;Pro: explicit and clear reference of the variable&lt;/LI&gt;&lt;LI&gt;Con: more challenging to scale + seems a bit brittle&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;LI&gt;Use the REST API `&lt;A href="https://docs.databricks.com/api/workspace/jobs/getrunoutput" target="_self"&gt;/api/2.2/jobs/runs/get-output&lt;/A&gt;` to set &amp;amp; get the variable&lt;UL&gt;&lt;LI&gt;Pro: overall seems easier to scale&lt;/LI&gt;&lt;LI&gt;Con: more challenging to implement + requiring the value to be passed through `dbutils.notebook.exit()` seems a bit limiting&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Please let me know if there are other/better approaches I may not be considering, or whether one of the above options is generally more or less recommended.&lt;/P&gt;&lt;P&gt;NOTE: I tried to paste an image, but lately the paste functionality has not been working. I attached a reference image as well in case the image paste didn't go through.&lt;/P&gt;</description>
      <pubDate>Mon, 06 Apr 2026 19:56:06 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153562#M53972</guid>
      <dc:creator>ChristianRRL</dc:creator>
      <dc:date>2026-04-06T19:56:06Z</dc:date>
    </item>
    <item>
      <title>Re: Passing Parameters *between* Workflow run_job steps</title>
      <link>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153566#M53974</link>
      <description>&lt;P&gt;Quick update, my question effectively boils down to:&lt;/P&gt;&lt;P class="lia-indent-padding-left-30px"&gt;Do Databricks Workflows have &lt;U&gt;&lt;STRONG&gt;"global" variables&lt;/STRONG&gt;&lt;/U&gt; that can be set programmatically from anywhere in the workflow (e.g. a nested notebook task inside a parent run_job task) during runtime and be referenced anywhere else in the workflow, regardless of scope?&lt;/P&gt;&lt;P&gt;Consulting with LLMs, I have some partial answers, but I would still appreciate some feedback from the community!&lt;/P&gt;&lt;P&gt;Updates on my considered approaches:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;I don't think the first option would work as I was hoping, due to variable scoping&lt;/LI&gt;&lt;LI&gt;The second option still seems viable, but the same challenges/trickiness persist&lt;/LI&gt;&lt;LI&gt;Other options I've seen proposed elsewhere:&lt;UL&gt;&lt;LI&gt;&lt;SPAN&gt;DBFS/Cloud Storage (e.g. a file with runtime information saved and referenced elsewhere during the job run)&lt;/SPAN&gt;&lt;/LI&gt;&lt;LI&gt;&lt;SPAN&gt;External DB/Table (e.g. tasks read/write key-value pairs to a shared Delta table or external database)&lt;/SPAN&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Mon, 06 Apr 2026 21:09:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153566#M53974</guid>
      <dc:creator>ChristianRRL</dc:creator>
      <dc:date>2026-04-06T21:09:22Z</dc:date>
    </item>
    <item>
      <title>Re: Passing Parameters *between* Workflow run_job steps</title>
      <link>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153606#M53976</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/96188"&gt;@ChristianRRL&lt;/a&gt;,&lt;/P&gt;
&lt;P class="p8i6j01 paragraph"&gt;No. As of now, Lakeflow Jobs doesn’t provide global, mutable variables that you can set from any task and read from any other task, regardless of scope. This is a current limitation of the platform...&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I think you’ve already explored the supported patterns (job parameters, task values, etc.). I'm assuming you have a reason to keep the computation inside a separate child job. If so, the most robust option is to&amp;nbsp;persist output_path to an external store (for example, a Delta table or a Unity Catalog volume / external location)&amp;nbsp;in the child job.&amp;nbsp;In the parent job, add a notebook task that reads that value and re-exposes it via dbutils.jobs.taskValues.set, and then reference it in downstream tasks using a dynamic value reference like {{tasks.&amp;lt;task_name&amp;gt;.values.output_path}}.&lt;/P&gt;
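&lt;P&gt;A minimal sketch of the child-job side of that pattern, assuming an illustrative shared Delta table main.etl.job_state and a run_key correlation parameter passed into the child job (neither is part of the original setup):&lt;/P&gt;

```python
# Child 1 notebook: persist the runtime-generated output_path to a shared
# Delta table so the parent job can pick it up later. Table name, catalog,
# and the run_key job parameter are illustrative assumptions.

def build_output_path(base, run_key):
    # The "not known until programmatically generated" path from the thread.
    return f"{base}/{run_key}/output"

def persist_output_path(spark, run_key, output_path):
    # Append a key/value row keyed by a correlation id shared across jobs.
    spark.sql(
        "CREATE TABLE IF NOT EXISTS main.etl.job_state "
        "(run_key STRING, key STRING, value STRING)"
    )
    spark.createDataFrame(
        [(run_key, "output_path", output_path)],
        "run_key STRING, key STRING, value STRING",
    ).write.mode("append").saveAsTable("main.etl.job_state")

# Inside the Databricks notebook (runtime-only, so shown commented out):
# run_key = dbutils.widgets.get("run_key")
# persist_output_path(spark, run_key,
#                     build_output_path("/Volumes/main/etl", run_key))
```

&lt;P&gt;Passing the same run_key value to both run_job steps (for example, a timestamp or the top-level job's run id) keeps the lookup unambiguous if a step is retried.&lt;/P&gt;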
&lt;P&gt;Using GET /api/2.1/jobs/runs/get-output doesn’t give you a global variable either. It’s read-only: it retrieves a completed run’s output, so you can’t set a variable in Lakeflow Jobs with it. It works best in an external-orchestrator pattern (external code runs Parent 1, calls get-output, then starts Parent 2 with that value as a job parameter).&lt;/P&gt;
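&lt;P&gt;For completeness, a rough sketch of that external-orchestrator pattern in stdlib Python only; the host, token, and job IDs are placeholders, and it assumes Parent 1's notebook returns the value via dbutils.notebook.exit(json.dumps(...)):&lt;/P&gt;

```python
# External-orchestrator sketch: run Parent 1, read its notebook output via
# the Jobs API, then start Parent 2 with that value as a job parameter.
# HOST, TOKEN, and the job IDs below are placeholders, not real values.
import json
import urllib.request

HOST = "https://example.cloud.databricks.com"
TOKEN = "dapi-REDACTED"

def jobs_api(path, payload=None):
    # Minimal Jobs API helper: POST when a payload is given, else GET.
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        HOST + path,
        data=data,
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def extract_output_path(run_output):
    # The value surfaces in notebook_output.result only if the child
    # notebook ended with dbutils.notebook.exit(json.dumps(...)).
    return json.loads(run_output["notebook_output"]["result"])["output_path"]

# Orchestration flow (needs a real workspace, so shown commented out):
# run1 = jobs_api("/api/2.2/jobs/run-now", {"job_id": 111})
# out = jobs_api(f"/api/2.2/jobs/runs/get-output?run_id={run1['run_id']}")
# jobs_api("/api/2.2/jobs/run-now",
#          {"job_id": 222,
#           "job_parameters": {"input_path": extract_output_path(out)}})
```

&lt;P&gt;One caveat worth checking in the API docs: for a multi-task job, get-output is called with the run_id of the individual task run, not the top-level job run.&lt;/P&gt;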
&lt;P class="p8i6j01 paragraph"&gt;Avoid using workspace DBFS for this kind of cross-job state. Prefer Unity Catalog managed storage instead.&lt;/P&gt;
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;I&gt;&lt;/I&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 07 Apr 2026 09:31:28 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153606#M53976</guid>
      <dc:creator>Ashwin_DSA</dc:creator>
      <dc:date>2026-04-07T09:31:28Z</dc:date>
    </item>
    <item>
      <title>Re: Passing Parameters *between* Workflow run_job steps</title>
      <link>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153652#M53988</link>
      <description>&lt;P&gt;I think this makes a lot of sense.&lt;/P&gt;&lt;P&gt;One follow-up question: do Lakeflow Jobs in some way support the ability for a child task to create or update a parent task value? An example in the context I shared earlier:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;We have a Lakeflow Job with two sequential run_job steps&lt;/LI&gt;&lt;LI&gt;The first run_job (let's call it Parent 1) has a notebook task (let's call it Child 1)&lt;/LI&gt;&lt;LI&gt;Somehow (if possible) I would like the notebook task to use `&lt;SPAN&gt;dbutils.jobs.taskValues.set()&lt;/SPAN&gt;` referencing Parent 1 as the task&lt;/LI&gt;&lt;LI&gt;If this can be achieved, then when we zoom out, Parent 1 already has the taskValue set, and it can then be referenced in the following run_job (let's call it Parent 2)&lt;/LI&gt;&lt;LI&gt;Once Parent 2 starts running, all Child 2-related tasks would have access to the referenced variable&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;Let me know if this makes sense and whether it's even possible.&lt;/P&gt;</description>
      <pubDate>Tue, 07 Apr 2026 19:36:24 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153652#M53988</guid>
      <dc:creator>ChristianRRL</dc:creator>
      <dc:date>2026-04-07T19:36:24Z</dc:date>
    </item>
    <item>
      <title>Re: Passing Parameters *between* Workflow run_job steps</title>
      <link>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153653#M53989</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/96188"&gt;@ChristianRRL&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;No. Lakeflow Jobs don’t support a child job/task setting or updating a parent job’s task values.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;dbutils.jobs.taskValues.set() always writes a value for the current task in the current job run. There is no way to target a different task or a different job (like the Run Job parent).&lt;/P&gt;
&lt;P&gt;Run Job creates a separate job run. Its task values remain scoped to that child job and cannot become the task values of the parent’s Run Job task, nor can they&amp;nbsp;be read by a sibling Run Job (your Parent 2).&lt;/P&gt;
&lt;P&gt;To get your pattern working, you still need to either move Child 1 into the same Lakeflow job as Parent 2 and use task values normally, or have Child 1 persist output_path to UC-managed storage, then in the parent job read it and re-expose it via dbutils.jobs.taskValues.set, which Parent 2 and its children can then reference.&lt;/P&gt;
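&lt;P&gt;A sketch of what that bridging notebook task in the parent job could look like, again assuming the illustrative main.etl.job_state table, a run_key correlation parameter, and a task named bridge_task:&lt;/P&gt;

```python
# Bridge task in the parent job: read the value Child 1 persisted and
# re-expose it as a task value for downstream tasks. The table and task
# names are illustrative assumptions, not part of the original workflow.

def lookup_value(rows, wanted_key):
    # Pure helper: pick the value for wanted_key from (key, value) pairs.
    for key, value in rows:
        if key == wanted_key:
            return value
    raise KeyError(f"{wanted_key} not found - did Child 1 run?")

def bridge(spark, dbutils, run_key):
    rows = [
        (r["key"], r["value"])
        for r in spark.table("main.etl.job_state")
                      .where(f"run_key = '{run_key}'")
                      .collect()
    ]
    output_path = lookup_value(rows, "output_path")
    # Downstream tasks, including Parent 2's run_job task, can now use the
    # dynamic value reference {{tasks.bridge_task.values.output_path}}.
    dbutils.jobs.taskValues.set(key="output_path", value=output_path)
```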
&lt;P class="p1"&gt;&lt;FONT size="2" color="#FF6600"&gt;&lt;STRONG&gt;&lt;I&gt;If this answer resolves your question, could you mark it as “Accept as Solution”? That helps other users quickly find the correct fix.&lt;/I&gt;&lt;/STRONG&gt;&lt;/FONT&gt;&lt;I&gt;&lt;/I&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 07 Apr 2026 19:54:08 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/passing-parameters-between-workflow-run-job-steps/m-p/153653#M53989</guid>
      <dc:creator>Ashwin_DSA</dc:creator>
      <dc:date>2026-04-07T19:54:08Z</dc:date>
    </item>
  </channel>
</rss>

