%run command: Pass Notebook path as a parameter
12-26-2024 10:45 AM
Hi team!
I have a Notebook (notebook A) in my workspace and I'd like to execute it with the %run command from another Notebook (notebook B). It works perfectly with this command:
%run /workspace/path/to/notebook/A
Now, I want to put the above path in a variable and pass that variable into the %run command, something like this:
%run $notebook_path
where notebook_path is defined as notebook_path = "/workspace/path/to/notebook/A".
However, I keep getting a "Notebook not found" error. I also tried %run {notebook_path}, which didn't work either. How can I make this work?
12-26-2024 11:58 AM
What if you try:
# Define the notebook path
notebook_path = "/workspace/path/to/notebook/A"
# Run the notebook using dbutils.notebook.run
dbutils.notebook.run(notebook_path, 60)
In this example, 60 is the timeout in seconds; you can adjust this value as needed. The dbutils.notebook.run method allows you to pass parameters and handle return values, which provides more flexibility than the %run command.
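To illustrate the parameter-passing and return-value flow described above, here is a minimal sketch. Outside Databricks, dbutils does not exist, so the sketch includes a small stand-in stub so it runs anywhere; on a real cluster you would delete the stub and use the provided dbutils object. The notebook path, parameter names, and return payload are all illustrative assumptions.

```python
import json

# Stand-in for the Databricks-provided `dbutils` object (delete on a real
# cluster). A real dbutils.notebook.run executes the callee notebook and
# returns the string that the callee passes to dbutils.notebook.exit.
class _NotebookStub:
    def run(self, path, timeout_seconds, arguments=None):
        return json.dumps({"status": "ok", "path": path, "args": arguments})

class _DbutilsStub:
    notebook = _NotebookStub()

dbutils = _DbutilsStub()

# In notebook B (caller): run A with a 60-second timeout and a parameter dict.
notebook_path = "/workspace/path/to/notebook/A"
result_json = dbutils.notebook.run(notebook_path, 60, {"run_date": "2024-12-26"})

# In notebook A (callee), the last cell would return a small JSON payload:
#   dbutils.notebook.exit(json.dumps({"status": "ok"}))
result = json.loads(result_json)
print(result["status"])
```

Returning a compact JSON string via dbutils.notebook.exit is a common way to hand small results back to the caller; larger results (like a DataFrame) are better shared through a table or view.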
12-26-2024 12:30 PM
%run is preferred in my use case because I need to access the DataFrame generated in notebook A from notebook B. With %run, that's very straightforward, since A is executed inline in the same context.
If I use dbutils.notebook.run, notebook A is executed in a different namespace. Is there an efficient way to get the DataFrame generated in A?
12-26-2024 03:30 PM
Thanks, I think dbutils.notebook.run will work well in my use case.
I was following this, and was able to store the results in a temp view in the callee notebook (A) and access the results from the caller notebook (B).
Just a quick question: how often are the temp views stored in "globalTempDatabase" cleared? Is that something we need to configure? I just want to make sure job performance won't be affected by too much temp data accumulating.
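The callee/caller pattern described above can be sketched as follows. Outside Databricks there is no SparkSession, so this sketch models the global_temp database with a small dict-backed stand-in so it runs anywhere; the real calls are shown in comments. The view name "shared_result" and the sample data are assumptions.

```python
# Minimal stand-in for the parts of SparkSession used here (delete on a
# real cluster): global temp views live in a database named "global_temp".
class _SparkStub:
    def __init__(self):
        self._global_temp = {}

    def create_global_temp_view(self, name, df):
        # Real API: df.createOrReplaceGlobalTempView(name)
        self._global_temp[name] = df

    def table(self, qualified_name):
        db, name = qualified_name.split(".", 1)
        assert db == "global_temp", "global temp views live in global_temp"
        return self._global_temp[name]

spark = _SparkStub()

# --- Notebook A (callee): register its result under a known view name ---
result_df = [("2024-12-26", 42)]  # stand-in for a real DataFrame
spark.create_global_temp_view("shared_result", result_df)
# On Databricks: result_df.createOrReplaceGlobalTempView("shared_result")

# --- Notebook B (caller): read the view back via the global_temp database ---
shared = spark.table("global_temp.shared_result")
print(shared)
```

Note that the view must be read through the global_temp qualifier (e.g. spark.table("global_temp.shared_result")); a plain temp view created with createOrReplaceTempView would not be visible outside the session that created it.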
12-26-2024 06:16 PM
Global temp views are available to all workloads running against the same compute resource, but they do not persist beyond the lifecycle of the cluster or the session that created them, so they will not accumulate indefinitely.
You can refer to https://docs.databricks.com/en/views/index.html#temporary-views

