Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Is dbutils.notebook.run() supported from a local Spark Connect environment (VS Code)?

filipniziol
Contributor III

Hi everyone,

I'm experimenting with the Databricks VS Code extension, using Spark Connect to run code locally in my Python environment while connecting to a Databricks cluster. I'm trying to call one notebook from another via:

notebook_params = {
    "catalog_name": "dev",
    "bronze_schema_name": "bronze",
    "silver_schema_name": "silver",
    "source_table_name": "source_table",
    "target_table_name": "target_table"
}

# Run the notebook with the specified parameters
result = dbutils.notebook.run("merge_table_silver_target_table", 60, notebook_params)

However, I am getting the error:

[screenshot: error raised by dbutils.notebook.run]

From what Iโ€™ve read, it seems dbutils.notebook.run() relies on the full Databricks notebook context, which might not exist in a local Spark Connect session.

As per the documentation:

[screenshot: documentation excerpt]

Could you confirm that running a notebook with dbutils.notebook.run is not possible in a local environment using Databricks Connect?
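For context, one way to keep shared code runnable both in a notebook and over Spark Connect is to branch on whether dbutils is available. A minimal sketch; the run_notebook helper and its fallback behavior are illustrative, not a Databricks API:

```python
# Sketch: wrap dbutils.notebook.run so shared code fails fast with a clear
# message when no notebook context exists (e.g. local Spark Connect in VS Code).
# The helper name and fallback behavior are illustrative, not a Databricks API.

def run_notebook(dbutils, path, timeout_seconds, params):
    if dbutils is None:
        # No notebook context locally: surface the limitation explicitly
        raise RuntimeError(
            f"Cannot run {path!r}: dbutils.notebook.run needs a notebook "
            "context, which a local Spark Connect session does not have."
        )
    # Notebook widget parameters are passed as strings
    str_params = {k: str(v) for k, v in params.items()}
    return dbutils.notebook.run(path, timeout_seconds, str_params)
```

In a notebook, the real dbutils is passed in and the call behaves as usual; locally, the caller gets an explicit error instead of a confusing context failure.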

1 ACCEPTED SOLUTION

Accepted Solutions

Alberto_Umana
Databricks Employee

Hi @filipniziol,

It is confirmed that dbutils.notebook.run relies on the full Databricks notebook context, which is not available in a local Spark Connect session. Therefore, running a notebook with dbutils.notebook.run is not possible in a local environment using Databricks Connect.
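A common workaround from a local environment is to submit the notebook as a one-time job run through the Jobs 2.1 REST API instead of dbutils.notebook.run. A minimal sketch: the helper names, host, token handling, and cluster id below are placeholders, while the /api/2.1/jobs/runs/submit endpoint and payload shape follow the Jobs API:

```python
# Sketch: run a workspace notebook from a local machine by submitting it as a
# one-time job run via the Jobs 2.1 REST API, rather than dbutils.notebook.run.
# build_submit_payload, submit_run, and the placeholder ids are illustrative.
import json
import urllib.request

def build_submit_payload(notebook_path, base_parameters, cluster_id):
    # Payload shape follows POST /api/2.1/jobs/runs/submit
    return {
        "run_name": f"local-run:{notebook_path}",
        "tasks": [
            {
                "task_key": "run_notebook",
                "existing_cluster_id": cluster_id,
                "notebook_task": {
                    "notebook_path": notebook_path,
                    "base_parameters": base_parameters,
                },
            }
        ],
    }

def submit_run(host, token, payload):
    # host is the workspace URL, token a PAT -- both placeholders here
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/runs/submit",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # response includes a "run_id" to poll
```

Unlike dbutils.notebook.run, this does not need a notebook context, so it works from VS Code; the trade-off is that the run result must be fetched separately (e.g. by polling the returned run_id).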


2 REPLIES


VZLA
Databricks Employee

@filipniziol Just curious: would getting the context and setting it manually help? Have you tried this approach?

Example:

# Grab the current notebook context and re-apply it
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
dbutils.notebook.setContext(ctx)

Or

from pyspark.dbutils import DBUtils
from pyspark.sql import SparkSession

spark = SparkSession.getActiveSession()
_dbutils = DBUtils(spark)  # DBUtils is constructed from the active session

# Fetch the notebook context and read its cluster id
notebook_context = _dbutils.notebook.entry_point.getDbutils().notebook().getContext()
notebook_context.clusterId().get()

The above should return an id, e.g. '1231-135641-d7rr9qht-v2y', so I'm curious to know whether setting the context via dbutils.notebook.setContext() would help it go through.
