<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Making dynamic tasks like in Airflow, but in Databricks? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/making-dynamic-tasks-like-in-airflow-but-in-databricks/m-p/118982#M45753</link>
    <description>&lt;P class="_1t7bu9h1 paragraph"&gt;Yes, it is possible to create dynamic tasks in Databricks workflows, similar to the approach using Apache Airflow, by leveraging Databricks' job orchestration capabilities. However, the implementation may differ from Airflow's dynamic DAG creation.&lt;/P&gt;
&lt;P class="_1t7bu9h1 paragraph"&gt;Databricks supports multi-task jobs, allowing users to specify multiple tasks (e.g., notebooks, JARs, Python scripts) within a single job. These tasks can declare dependencies and run with parameterized execution, so a task can dynamically consume inputs such as table names. The parameterization functionality in Databricks jobs enables passing parameters to notebooks, facilitating dynamic task execution without manually duplicating a notebook for each input.&lt;/P&gt;
&lt;P&gt;Airflow itself integrates with Databricks via operators like &lt;CODE&gt;DatabricksRunNowOperator&lt;/CODE&gt; and &lt;CODE&gt;DatabricksSubmitRunOperator&lt;/CODE&gt; from the Airflow Databricks provider. These operators can trigger tasks or workflows defined in Databricks jobs, leveraging dynamic input arguments to execute tasks for specific items or tables. This integration enables Airflow DAGs to dynamically orchestrate tasks based on external criteria such as lists of table names.&lt;/P&gt;
&lt;P&gt;For more details, see&amp;nbsp;&lt;A href="https://docs.databricks.com/gcp/en/jobs/how-to/use-airflow-with-jobs" target="_blank"&gt;https://docs.databricks.com/gcp/en/jobs/how-to/use-airflow-with-jobs&lt;/A&gt;&lt;/P&gt;</description>
    <pubDate>Tue, 13 May 2025 02:12:35 GMT</pubDate>
    <dc:creator>kamal_ch</dc:creator>
    <dc:date>2025-05-13T02:12:35Z</dc:date>
    <item>
      <title>Making dynamic tasks like in Airflow, but in Databricks?</title>
      <link>https://community.databricks.com/t5/data-engineering/making-dynamic-tasks-like-in-airflow-but-in-databricks/m-p/110143#M43495</link>
      <description>&lt;P&gt;I've used Airflow which allows us to create a DAG with dynamic tasks, for example we can have a list of items (such as table names),&amp;nbsp; loop through an operator that accepts a table name and create a task for each table without having to create a new notebook for each table.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Is this possible in Databricks as well?&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 13 Feb 2025 17:50:16 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/making-dynamic-tasks-like-in-airflow-but-in-databricks/m-p/110143#M43495</guid>
      <dc:creator>KristiLogos</dc:creator>
      <dc:date>2025-02-13T17:50:16Z</dc:date>
    </item>
    <item>
      <title>Re: Making dynamic tasks like in Airflow, but in Databricks?</title>
      <link>https://community.databricks.com/t5/data-engineering/making-dynamic-tasks-like-in-airflow-but-in-databricks/m-p/118982#M45753</link>
      <description>&lt;P class="_1t7bu9h1 paragraph"&gt;Yes, it is possible to create dynamic tasks in Databricks workflows, similar to the approach using Apache Airflow, by leveraging Databricks' job orchestration capabilities. However, the implementation may differ from Airflow's dynamic DAG creation.&lt;/P&gt;
&lt;P class="_1t7bu9h1 paragraph"&gt;Databricks supports multi-task jobs, allowing users to specify multiple tasks (e.g., notebooks, JARs, Python scripts) within a single job. These tasks can declare dependencies and run with parameterized execution, so a task can dynamically consume inputs such as table names. The parameterization functionality in Databricks jobs enables passing parameters to notebooks, facilitating dynamic task execution without manually duplicating a notebook for each input.&lt;/P&gt;
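As a rough sketch of the multi-task pattern described above (not an official Databricks sample): the snippet below builds one parameterized notebook task per table for a Jobs API 2.1 job payload. The notebook path, job name, and table names are illustrative assumptions; each task reuses the same notebook and passes the table name via base_parameters.

```python
def build_table_tasks(tables, notebook_path="/Shared/load_table"):
    """Return one Jobs API 2.1 task spec per table, all reusing one notebook."""
    tasks = []
    for table in tables:
        tasks.append({
            "task_key": f"load_{table}",
            "notebook_task": {
                "notebook_path": notebook_path,
                # The notebook reads this with dbutils.widgets.get("table_name")
                "base_parameters": {"table_name": table},
            },
        })
    return tasks

# Illustrative job payload for POST /api/2.1/jobs/create
job_payload = {
    "name": "dynamic_table_loads",
    "tasks": build_table_tasks(["orders", "customers", "payments"]),
}
```

The table list can come from anywhere (a config file, a metadata query), so adding a table means extending the list rather than cloning a notebook.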
&lt;P&gt;Airflow itself integrates with Databricks via operators like &lt;CODE&gt;DatabricksRunNowOperator&lt;/CODE&gt; and &lt;CODE&gt;DatabricksSubmitRunOperator&lt;/CODE&gt; from the Airflow Databricks provider. These operators can trigger tasks or workflows defined in Databricks jobs, leveraging dynamic input arguments to execute tasks for specific items or tables. This integration enables Airflow DAGs to dynamically orchestrate tasks based on external criteria such as lists of table names.&lt;/P&gt;
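On the Airflow side, the dynamic pattern from the original question boils down to looping over the table list and creating one operator per table. To stay self-contained, the sketch below only generates the per-table keyword arguments one would pass to `DatabricksRunNowOperator` (from the apache-airflow-providers-databricks package); the `job_id` and table names are illustrative assumptions.

```python
def run_now_kwargs_per_table(tables, job_id=123):
    """One kwargs dict per table; inside a DAG definition, each dict would
    back a single operator, e.g.:

        DatabricksRunNowOperator(**kwargs)
    """
    return [
        {
            "task_id": f"run_{table}",
            "job_id": job_id,
            # Forwarded by the triggered run to the notebook's widgets
            "notebook_params": {"table_name": table},
        }
        for table in tables
    ]
```

Each generated task triggers the same Databricks job with a different `notebook_params` value, mirroring Airflow's dynamic-task idiom without a notebook per table.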
&lt;P&gt;For more details, see&amp;nbsp;&lt;A href="https://docs.databricks.com/gcp/en/jobs/how-to/use-airflow-with-jobs" target="_blank"&gt;https://docs.databricks.com/gcp/en/jobs/how-to/use-airflow-with-jobs&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 13 May 2025 02:12:35 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/making-dynamic-tasks-like-in-airflow-but-in-databricks/m-p/118982#M45753</guid>
      <dc:creator>kamal_ch</dc:creator>
      <dc:date>2025-05-13T02:12:35Z</dc:date>
    </item>
  </channel>
</rss>

