Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

%run command: Pass Notebook path as a parameter

lauraxyz
New Contributor III

Hi team!

I have a notebook (notebook A) in my workspace, and I'd like to execute it with the %run command from another notebook (notebook B). It works perfectly with the command:

%run /workspace/path/to/notebook/A

Now, I want to specify the above path in a variable and pass that variable to the %run command, something like this:

%run $notebook_path

where notebook_path is defined as notebook_path = "/workspace/path/to/notebook/A".

However, I keep getting a "Notebook not found" error. I also tried %run {notebook_path}, which didn't work either. How can I make this work?

4 REPLIES

Walter_C
Databricks Employee

What if you try:

# Define the notebook path
notebook_path = "/workspace/path/to/notebook/A"

# Run the notebook using dbutils.notebook.run
dbutils.notebook.run(notebook_path, 60)

In this example, 60 is the timeout in seconds; adjust it as needed. The dbutils.notebook.run method also lets you pass parameters and handle return values, which gives you more flexibility than the %run command.
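For example, a minimal sketch of passing a parameter and capturing a return value (the widget name "run_date" and the values are only illustrations, not from your notebooks):

# In notebook A (the callee): read a parameter and return a value to the caller
date = dbutils.widgets.get("run_date")
dbutils.notebook.exit(f"processed {date}")

# In notebook B (the caller): pass the parameter and capture what A returned
result = dbutils.notebook.run(notebook_path, 60, {"run_date": "2024-01-01"})
print(result)  # "processed 2024-01-01"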

lauraxyz
New Contributor III

%run is preferred in my use case because I need to access the DataFrame generated in notebook A from notebook B. With %run, that's straightforward, since A is executed inline in the same context.

If I use dbutils.notebook.run, notebook A will be executed in a different namespace. Is there an efficient way to get the DataFrame generated in A?

lauraxyz
New Contributor III

Thanks, I think dbutils.notebook.run will work well in my use case.

I was following this approach, and was able to store the results in a temp view in the callee notebook (A) and access them from the caller notebook (B).

Just a quick question: how often are the temp views stored in "globalTempDatabase" cleared? Is that something we need to configure? I just want to make sure it won't affect our job performance by accumulating too much temporary data.
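For reference, the pattern I used looks roughly like this (a minimal sketch; the view name shared_result is just an example):

# In notebook A (the callee): expose the result DataFrame as a global temp view
df.createOrReplaceGlobalTempView("shared_result")
dbutils.notebook.exit("shared_result")

# In notebook B (the caller): run A, then read the view from the global_temp database
view_name = dbutils.notebook.run(notebook_path, 60)
result_df = spark.table(f"global_temp.{view_name}")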

Walter_C
Databricks Employee

These global temp views are available to all workloads running against a compute resource, but they do not persist beyond the lifecycle of the cluster that created them, so they are cleaned up automatically when the cluster terminates; there is nothing you need to configure.
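If you want to see what is currently registered, or drop a view explicitly once you are done with it, a quick sketch (the view name here is hypothetical):

# List the global temp views currently registered on this cluster
spark.sql("SHOW VIEWS IN global_temp").show()

# Drop a specific view explicitly when it is no longer needed
spark.sql("DROP VIEW IF EXISTS global_temp.shared_result")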

You can refer to https://docs.databricks.com/en/views/index.html#temporary-views 
