<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to dynamically have the parent notebook call on a child notebook? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/how-to-dynamically-have-the-parent-notebook-call-on-a-child/m-p/122185#M46684</link>
    <description>&lt;P&gt;If you need to pass variables, you would indeed need to use&amp;nbsp;&lt;SPAN&gt;dbutils.notebook.run() instead of %run.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;As far as I'm aware, you can't share objects between notebook executions. If you need to return a dataframe, the easiest solution is to create it as a table within Unity Catalog and fetch it later when you need it.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Databricks is not really meant for passing dataframes between notebooks. Instead, create a notebook that runs the setup, functions, filtering, etc. and writes a table, which you can then easily import and use later. That way you also get better traceability, clarity and separation of code, which is easier to maintain in the long run.&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Wed, 18 Jun 2025 19:29:36 GMT</pubDate>
    <dc:creator>loui_wentzel</dc:creator>
    <dc:date>2025-06-18T19:29:36Z</dc:date>
    <item>
      <title>How to dynamically have the parent notebook call on a child notebook?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-dynamically-have-the-parent-notebook-call-on-a-child/m-p/122157#M46679</link>
      <description>&lt;P&gt;Hi! I would appreciate help with dynamically calling one notebook from another in Databricks so that the parent notebook gets the dataframe results from the child notebook. For background, I have a main Python notebook and multiple SQL notebooks. The Python notebook needs to call one of the SQL notebooks via a variable holding the SQL notebook's name, and the SQL notebook should return a dataframe to the Python notebook. So I need my Python notebook to dynamically change the file path of whichever SQL notebook I want to call, and this is the part I am stuck on. Here's what I tried:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;The %run command doesn't allow variables in the file path, so I'm unable to dynamically call the SQL notebooks&lt;/LI&gt;&lt;LI&gt;dbutils.notebook.run()&lt;I&gt;&amp;nbsp;&lt;/I&gt;allows variables in the file path, but I don't know how to return the dataframe results from the SQL notebook to the parent Python notebook&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;What would be the best way to accomplish this? Thank you so much for any input!&lt;/P&gt;</description>
      <pubDate>Wed, 18 Jun 2025 17:10:14 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-dynamically-have-the-parent-notebook-call-on-a-child/m-p/122157#M46679</guid>
      <dc:creator>alau131</dc:creator>
      <dc:date>2025-06-18T17:10:14Z</dc:date>
    </item>
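The two attempts described in the question can be sketched roughly as follows. Paths and notebook names are illustrative assumptions, and dbutils exists only inside a Databricks notebook:

```python
# Attempt 1: %run takes a literal path, so a variable cannot be spliced in:
#   %run ./sql_notebooks/daily_sales          # a literal path works
#   %run ./sql_notebooks/{notebook_name}      # the variable is NOT interpolated

# Attempt 2: dbutils.notebook.run() accepts a computed path, but its return
# value is limited to the string the child passes to dbutils.notebook.exit(),
# so a DataFrame cannot come back directly.
def child_path(notebook_name):
    """Build the child notebook path from a variable name."""
    return f"./sql_notebooks/{notebook_name}"

def run_child(notebook_name, timeout_seconds=600):
    """Run the child notebook dynamically; only callable on Databricks."""
    return dbutils.notebook.run(child_path(notebook_name), timeout_seconds)
```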
    <item>
      <title>Re: How to dynamically have the parent notebook call on a child notebook?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-dynamically-have-the-parent-notebook-call-on-a-child/m-p/122185#M46684</link>
      <description>&lt;P&gt;If you need to pass variables, you would indeed need to use&amp;nbsp;&lt;SPAN&gt;dbutils.notebook.run() instead of %run.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;As far as I'm aware, you can't share objects between notebook executions. If you need to return a dataframe, the easiest solution is to create it as a table within Unity Catalog and fetch it later when you need it.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Databricks is not really meant for passing dataframes between notebooks. Instead, create a notebook that runs the setup, functions, filtering, etc. and writes a table, which you can then easily import and use later. That way you also get better traceability, clarity and separation of code, which is easier to maintain in the long run.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 18 Jun 2025 19:29:36 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-dynamically-have-the-parent-notebook-call-on-a-child/m-p/122185#M46684</guid>
      <dc:creator>loui_wentzel</dc:creator>
      <dc:date>2025-06-18T19:29:36Z</dc:date>
    </item>
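The pattern recommended in the reply above might look roughly like this: the child notebook materializes its result as a Unity Catalog table and exits with the table name, and the parent runs the child dynamically and reads the table back. All catalog, table, and notebook names here are assumptions, and both functions are only callable on Databricks:

```python
# --- child notebook (last cell) ---
def child_exit_with_table(df, table_name="main.staging.daily_sales"):
    """Persist the child's result and hand its name back to the parent."""
    df.write.mode("overwrite").saveAsTable(table_name)  # materialize the dataframe
    dbutils.notebook.exit(table_name)                   # exit() can only return a string

# --- parent notebook ---
def fetch_child_result(notebook_name):
    """Run a child notebook by variable name and re-read the table it wrote."""
    path = f"./sql_notebooks/{notebook_name}"
    table_name = dbutils.notebook.run(path, 600)        # the string from exit()
    return spark.table(table_name)                      # the dataframe, re-read
```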
    <item>
      <title>Re: How to dynamically have the parent notebook call on a child notebook?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-dynamically-have-the-parent-notebook-call-on-a-child/m-p/122189#M46687</link>
      <description>&lt;P&gt;What you are looking to do is really not the intent of notebooks, and you cannot pass complex data types between notebooks. You would need to persist the data frame from the child notebook so the parent notebook can retrieve the results after the child notebook completes. This is in line with what&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/164609"&gt;@loui_wentzel&lt;/a&gt;&amp;nbsp;recommended.&lt;/P&gt;&lt;P&gt;The better pattern here would be to take the logic in each of your child notebooks and create a function for each in a Python library that you can call from your main notebook. Once you have your function library, build it into a wheel (.whl) file, install it on the cluster, import the library into your main notebook, and make the appropriate function call based on your business requirements.&lt;/P&gt;</description>
      <pubDate>Wed, 18 Jun 2025 20:40:51 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-dynamically-have-the-parent-notebook-call-on-a-child/m-p/122189#M46687</guid>
      <dc:creator>jameshughes</dc:creator>
      <dc:date>2025-06-18T20:40:51Z</dc:date>
    </item>
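The library approach suggested in the last reply could be sketched as below: each former SQL notebook becomes a function in a Python package (built into a .whl, installed on the cluster, and imported from the main notebook), and a name-based dispatch replaces the dynamic notebook path. Module, function, and table names are illustrative assumptions:

```python
# my_etl/queries.py -- one function per former SQL child notebook
def daily_sales(spark, run_date):
    return spark.sql(f"SELECT * FROM main.sales.daily WHERE run_date = '{run_date}'")

def inventory(spark, run_date):
    return spark.sql(f"SELECT * FROM main.inventory.stock WHERE run_date = '{run_date}'")

QUERIES = {"daily_sales": daily_sales, "inventory": inventory}

def run_query(name, spark, run_date):
    """Dispatch by name -- this replaces the variable notebook-path lookup
    and returns a real DataFrame to the caller, no persistence needed."""
    if name not in QUERIES:
        raise ValueError(f"unknown query: {name}")
    return QUERIES[name](spark, run_date)
```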
  </channel>
</rss>

