Hi everyone,
I'm experimenting with the Databricks VS Code extension, using Spark Connect to run code locally in my Python environment while connecting to a Databricks cluster. I'm trying to call one notebook from another via:
notebook_params = {
    "catalog_name": "dev",
    "bronze_schema_name": "bronze",
    "silver_schema_name": "silver",
    "source_table_name": "source_table",
    "target_table_name": "target_table",
}

# Run the notebook with the specified parameters (60-second timeout)
result = dbutils.notebook.run("merge_table_silver_target_table", 60, notebook_params)
However, I am getting the error:
From what I've read, it seems dbutils.notebook.run() relies on the full Databricks notebook context, which might not exist in a local Spark Connect session.
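In case it's useful, this is the small guard I'm using in the meantime. The helper name and the dbutils-injection pattern are my own (not from the docs); it just delegates to dbutils.notebook.run() when a notebook context exists and fails fast with a clear message when it doesn't, as appears to be the case locally over Spark Connect:

```python
def run_notebook(path, timeout_s, params, dbutils=None):
    """Call dbutils.notebook.run when a notebook context is available.

    dbutils is passed in explicitly, so when this code runs locally
    (Databricks Connect / Spark Connect, no notebook context) it raises
    a clear error instead of an obscure one from deep inside dbutils.
    """
    if dbutils is None or not hasattr(dbutils, "notebook"):
        raise RuntimeError(
            f"No notebook context: cannot run {path!r} locally. "
            "Run this from a Databricks notebook, or submit it as a job instead."
        )
    return dbutils.notebook.run(path, timeout_s, params)
```

Inside a real notebook I'd call it as `run_notebook("merge_table_silver_target_table", 60, notebook_params, dbutils=dbutils)`; locally it raises immediately with the message above.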
My question, per the documentation:
Can you confirm that running a notebook with dbutils.notebook.run() is not possible in a local environment using Databricks Connect?