<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Is dbutils.notebook.run() supported from a local Spark Connect environment (VS Code)? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/is-dbutils-notebook-run-supported-from-a-local-spark-connect/m-p/103701#M41555</link>
    <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/117376"&gt;@filipniziol&lt;/a&gt;&amp;nbsp;just curious whether getting the context and setting it manually would help. Have you tried this approach?&lt;/P&gt;
&lt;P&gt;Example:&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
dbutils.notebook.setContext(ctx)&lt;/LI-CODE&gt;
&lt;P&gt;Or&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;from pyspark.dbutils import DBUtils
from pyspark.sql import SparkSession

spark = SparkSession.getActiveSession()
_dbutils = DBUtils(spark)  # pyspark.dbutils exposes DBUtils(spark) directly

notebook_context = _dbutils.notebook.entry_point.getDbutils().notebook().getContext()
notebook_context.clusterId().get()&lt;/LI-CODE&gt;
&lt;P&gt;The above should return an ID, e.g. &lt;SPAN&gt;'1231-135641-d7rr9qht-v2y', so I'm curious whether setting the context via dbutils.notebook.setContext() would help the call go through.&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Tue, 31 Dec 2024 14:01:02 GMT</pubDate>
    <dc:creator>VZLA</dc:creator>
    <dc:date>2024-12-31T14:01:02Z</dc:date>
    <item>
      <title>Is dbutils.notebook.run() supported from a local Spark Connect environment (VS Code)?</title>
      <link>https://community.databricks.com/t5/data-engineering/is-dbutils-notebook-run-supported-from-a-local-spark-connect/m-p/103452#M41443</link>
      <description>&lt;P&gt;Hi everyone,&lt;/P&gt;&lt;P&gt;I’m experimenting with the Databricks VS Code extension, using &lt;STRONG&gt;Spark Connect&lt;/STRONG&gt; to run code locally in my Python environment while connecting to a Databricks cluster. I’m trying to call one notebook from another via:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;notebook_params = {
    "catalog_name": "dev",
    "bronze_schema_name": "bronze",
    "silver_schema_name": "silver",
    "source_table_name": "source_table",
    "target_table_name": "target_table"
}

# Run the notebook with the specified parameters
result = dbutils.notebook.run("merge_table_silver_target_table", 60, notebook_params)&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;However, I am getting the error:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="filipniziol_0-1735481046190.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/13753iA33F58883278A8C8/image-size/medium?v=v2&amp;amp;px=400" role="button" title="filipniziol_0-1735481046190.png" alt="filipniziol_0-1735481046190.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;From what I’ve read, it seems dbutils.notebook.run() relies on the full Databricks notebook context, which might not exist in a local Spark Connect session.&lt;BR /&gt;&lt;BR /&gt;As per the &lt;A href="https://docs.databricks.com/en/dev-tools/sdk-python.html#use-databricks-utilities" target="_self"&gt;documentation&lt;/A&gt;:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="filipniziol_1-1735481141816.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/13754i55A266D8261B0EFA/image-size/medium?v=v2&amp;amp;px=400" role="button" title="filipniziol_1-1735481141816.png" alt="filipniziol_1-1735481141816.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Could you confirm that running a notebook with dbutils.notebook.run is not possible in a local environment using Databricks Connect?&lt;/P&gt;</description>
      <pubDate>Sun, 29 Dec 2024 14:07:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/is-dbutils-notebook-run-supported-from-a-local-spark-connect/m-p/103452#M41443</guid>
      <dc:creator>filipniziol</dc:creator>
      <dc:date>2024-12-29T14:07:02Z</dc:date>
    </item>
    <item>
      <title>Re: Is dbutils.notebook.run() supported from a local Spark Connect environment (VS Code)?</title>
      <link>https://community.databricks.com/t5/data-engineering/is-dbutils-notebook-run-supported-from-a-local-spark-connect/m-p/103478#M41451</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/117376"&gt;@filipniziol&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;It is confirmed that &lt;CODE&gt;dbutils.notebook.run&lt;/CODE&gt; relies on the full Databricks notebook context, which is not available in a local Spark Connect session. Therefore, running a notebook with &lt;CODE&gt;dbutils.notebook.run&lt;/CODE&gt; is not possible in a local environment using Databricks Connect.&lt;/P&gt;</description>
      <pubDate>Sun, 29 Dec 2024 22:47:28 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/is-dbutils-notebook-run-supported-from-a-local-spark-connect/m-p/103478#M41451</guid>
      <dc:creator>Alberto_Umana</dc:creator>
      <dc:date>2024-12-29T22:47:28Z</dc:date>
    </item>
    <item>
      <title>Re: Is dbutils.notebook.run() supported from a local Spark Connect environment (VS Code)?</title>
      <link>https://community.databricks.com/t5/data-engineering/is-dbutils-notebook-run-supported-from-a-local-spark-connect/m-p/103701#M41555</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/117376"&gt;@filipniziol&lt;/a&gt;&amp;nbsp;just curious whether getting the context and setting it manually would help. Have you tried this approach?&lt;/P&gt;
&lt;P&gt;Example:&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
dbutils.notebook.setContext(ctx)&lt;/LI-CODE&gt;
&lt;P&gt;Or&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;from pyspark.dbutils import DBUtils
from pyspark.sql import SparkSession

spark = SparkSession.getActiveSession()
_dbutils = DBUtils(spark)  # pyspark.dbutils exposes DBUtils(spark) directly

notebook_context = _dbutils.notebook.entry_point.getDbutils().notebook().getContext()
notebook_context.clusterId().get()&lt;/LI-CODE&gt;
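&lt;P&gt;If restoring the context doesn't work, another local-side pattern is to dispatch: use dbutils.notebook.run when a notebook context exists, and otherwise fall back to submitting the notebook as a job. A minimal sketch, with illustrative names only (run_notebook and the jobs_submit callable are not a real Databricks API; in practice jobs_submit would wrap a Jobs client such as the one in the Databricks SDK):&lt;/P&gt;

```python
# Hedged sketch, not an official Databricks API: dbutils.notebook.run needs
# the workspace notebook context, so under local Spark Connect one fallback
# is to submit the notebook through the Jobs API instead.
def run_notebook(path, timeout_s, params, dbutils=None, jobs_submit=None):
    """Run a notebook via dbutils when a context exists, else via a Jobs fallback."""
    if dbutils is not None:
        # Inside a workspace notebook: the native call works as usual.
        return dbutils.notebook.run(path, timeout_s, params)
    if jobs_submit is None:
        raise RuntimeError("No notebook context: supply a Jobs API fallback")
    # Local Spark Connect session: delegate to the Jobs API wrapper.
    return jobs_submit(path, timeout_s, params)
```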
&lt;P&gt;The above should return an ID, e.g. &lt;SPAN&gt;'1231-135641-d7rr9qht-v2y', so I'm curious whether setting the context via dbutils.notebook.setContext() would help the call go through.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 31 Dec 2024 14:01:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/is-dbutils-notebook-run-supported-from-a-local-spark-connect/m-p/103701#M41555</guid>
      <dc:creator>VZLA</dc:creator>
      <dc:date>2024-12-31T14:01:02Z</dc:date>
    </item>
  </channel>
</rss>

