How do I pass arguments/variables to notebooks?

__Databricks_Su
Contributor
 
17 REPLIES

__Databricks_Su
Contributor

To execute NotebookB from NotebookA with arguments, you would use the following syntax within NotebookA to define the arguments:

%run path/to/NotebookB $VarA="ValueA" $VarB="ValueB"

Within NotebookB, you'd use the following to receive the argument value:

Scala and Python:

print(getArgument("VariableName", "DefaultValue"))

SQL (No default value is supported):

select * from $VariableName

Shouldn't the SQL be

select * from $VarA
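
For completeness, here is a minimal sketch of what the receiving side of NotebookB might look like, assuming the %run call shown above; getArgument is the legacy helper from the original answer, and on newer runtimes dbutils.widgets.get("VarA") is the usual equivalent:

# NotebookB -- receives the values passed by:
#   %run path/to/NotebookB $VarA="ValueA" $VarB="ValueB"
var_a = getArgument("VarA", "DefaultA")   # "ValueA" when invoked via the %run above
var_b = getArgument("VarB", "DefaultB")   # "ValueB" when invoked via the %run above
print(var_a, var_b)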

Is there a way to pass a variable's value from another cell to NotebookB? For example,

---- Cell 1 ----

john = 10

---- Cell 2 ----

%run path/to/NotebookB $VarA = john

submits "john" to NotebookB not the value of 10

@kruhly Unfortunately right now you can't pass specific variable names, only constants. It's on our roadmap to improve/overhaul parameter passing, which would include this capability.

osr2020
New Contributor II

Were there any updates in this regard?

It's been 1.5 years, so I'm hoping this capability has been added by now.

Looks like this feature is still not available. Is there any plan to support passing parameter values from another cell when running a notebook via %run?

Dior
New Contributor II

Using the dbutils.notebook API helped with this. The command follows this syntax:

dbutils.notebook.run("notebook", timeout_seconds, {"widget_name": "widget_value"})

To get a variable in there just do this:

x = "value"

dbutils.notebook.run("run_weekly_notebook", 60, {"stores_available": f"{x}"})

Is there any plan to support default argument values for SQL?

@Databricks_Support, when I try the above, first %run tells me it cannot find the notebook (although they are in the same folder; I do not know how to set path\to\notebook correctly). Second, getArgument gives me an error:

py4j.Py4JException: Method createTextWidget([class java.lang.String, class java.lang.Integer, null, null]) does not exist

I am using Python notebooks.

Is there any similar functionality for an R notebook?

SimeonSimeonov_
New Contributor II

@Oren Benjamin you can use a UDF in SQL and then, in the UDF's code, you can use the Scala/Python form.
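
As a rough sketch of that approach (the widget, table, and column names here are hypothetical): read the widget value once on the driver, bake it into a zero-argument UDF, and call that UDF from SQL.

# Python cell: read the widget value on the driver and register a 0-arg UDF that returns it
value = dbutils.widgets.get("VariableName")
spark.udf.register("notebook_param", lambda: value)   # default return type is string

# SQL cell (hypothetical table/column):
#   SELECT * FROM my_table WHERE some_column = notebook_param()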

naman1994
New Contributor III

If you are running a notebook from another notebook, use dbutils.notebook.run(path, timeout_seconds, arguments); you can pass variables in the arguments dictionary. Then use dbutils.widgets.get() in the called notebook to receive each value.

And if you are not running a notebook from another notebook, and just want to set a variable before running a particular cell, use dbutils.widgets.text("a", "b", "c") and dbutils.widgets.get("a").

dbutils.widgets.text("a", "b", "c"): creates the widget and sets its default value; a is the widget name, b is the default value, c is the label shown next to the textbox.

dbutils.widgets.get("a"): returns the current value of the widget named a as a string.
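
A minimal sketch of that widget pattern inside a single notebook (the names here are just for illustration):

dbutils.widgets.text("env", "dev", "Environment")   # widget name, default value, label
env = dbutils.widgets.get("env")                    # returns the current value as a string
print(f"running against {env}")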

lshar
New Contributor III

Hello, is there already a solution to this problem?

I have found a workaround for my use case. However, the workaround seems to hit a Databricks limitation: without the if statement the %run call works, but with the if an error is displayed, as can be seen in the screenshot.

I am very interested in a solution!
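
One possible explanation is that %run generally has to sit alone in its cell, so wrapping it in an if tends to fail. A hedged sketch of a conditional alternative using dbutils.notebook.run (the path, timeout, and flag below are hypothetical):

run_weekly = True   # hypothetical flag computed earlier in the notebook

if run_weekly:
    # unlike %run, dbutils.notebook.run() can be called from ordinary Python control flow
    result = dbutils.notebook.run("path/to/NotebookB", 600, {"VarA": "ValueA"})
    print(result)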

Meghala
Valued Contributor II

I need this info as well.
