Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

%run command: Pass Notebook path as a parameter

lauraxyz
Contributor

Hi team!

I have a Notebook (notebook A) in my workspace, and I'd like to execute it with the %run command from another Notebook (notebook B). It works perfectly with the command:

%run /workspace/path/to/notebook/A

Now, I want to put the above path in a variable and pass that variable into the %run command, something like this:

%run $notebook_path

where notebook_path is defined as notebook_path = "/workspace/path/to/notebook/A".

However, I keep getting a "Notebook not found" error. I also tried %run {notebook_path}, which didn't work either. How can I make this work?

4 REPLIES

Walter_C
Databricks Employee

What if you try:

# Define the notebook path
notebook_path = "/workspace/path/to/notebook/A"

# Run the notebook using dbutils.notebook.run
dbutils.notebook.run(notebook_path, 60)

In this example, 60 is the timeout in seconds; adjust that value as needed. The dbutils.notebook.run method lets you pass parameters and handle return values, which gives you more flexibility than the %run command.
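To illustrate the parameter passing and return values mentioned above, here is a minimal sketch. It assumes a Databricks runtime (where `dbutils` is predefined); the path, parameter name, and return string are hypothetical.

```python
# Caller notebook (B): run notebook A with a parameter and capture its exit value.
notebook_path = "/workspace/path/to/notebook/A"  # hypothetical path

result = dbutils.notebook.run(
    notebook_path,
    60,                                  # timeout in seconds
    {"input_date": "2024-01-01"},        # hypothetical parameter; surfaces as a widget in A
)
print(result)  # whatever notebook A passed to dbutils.notebook.exit(...)

# Inside the callee notebook (A), you would read the parameter and return a value:
#   input_date = dbutils.widgets.get("input_date")
#   dbutils.notebook.exit("processed " + input_date)
```

Note that only strings can cross this boundary (parameters in, exit value out), which is why larger results are usually handed off through tables or views instead.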

lauraxyz
Contributor

The %run command is preferred in my use case because I need to access the DataFrame generated in notebook A from notebook B. With %run, that's very straightforward, since A is executed inline in the same context.

If I use dbutils.notebook.run, notebook A will be executed in a different namespace. Is there an efficient way to get the DataFrame generated in A?

lauraxyz
Contributor

Thanks, I think dbutils.notebook.run will work well in my use case.

I was following this approach and was able to store the results in a temp view in the callee notebook (A) and access them from the caller notebook (B).

Just a quick question: how often are the temp views stored in "globalTempDatabase" cleared? Is that something we need to configure? I just want to make sure our job performance won't suffer from too much temp data accumulating.

Walter_C
Databricks Employee

These global temp views are available to all workloads running against a compute resource, but they do not persist beyond the lifecycle of the cluster or the session that created them.

You can refer to https://docs.databricks.com/en/views/index.html#temporary-views 
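To make the handoff concrete, here is a sketch of the global-temp-view pattern described above. It assumes a Databricks/Spark runtime (where `spark` is predefined); the view name "shared_results" and the stand-in DataFrame are hypothetical.

```python
# Callee notebook (A): publish the result DataFrame as a global temp view.
df = spark.range(5)  # stand-in for the real result DataFrame
df.createGlobalTempView("shared_results")

# Caller notebook (B): global temp views live in the reserved `global_temp` database.
shared_df = spark.table("global_temp.shared_results")

# Cleanup happens automatically when the cluster terminates, or explicitly with:
spark.catalog.dropGlobalTempView("shared_results")
```

Because the view is tied to the Spark application (the cluster), there is nothing to configure for cleanup; dropping it explicitly at the end of a job is just a tidiness measure.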
