Data Engineering
Current date as default in a widget while scheduling the notebook

philip
New Contributor

I have scheduled a notebook. Can I keep the current date as the default widget value whenever the notebook runs, while also keeping the flexibility to change the widget value to any other date for an ad hoc run?

1 ACCEPTED SOLUTION

Accepted Solutions

Hubert-Dudek
Esteemed Contributor III

The second parameter is the default value. Here is some example code:

# Build a small example DataFrame
data = [
    {"Category": "Category A", "ID": 1, "Value": 12.40},
    {"Category": "Category B", "ID": 2, "Value": 30.10},
    {"Category": "Category C", "ID": 3, "Value": 100.01},
]
df = spark.createDataFrame(data)

# Collect the IDs and use the highest one as the widget default
values = [row.ID for row in df.select("ID").collect()]
default_value = str(max(values))

# Widget defaults and choices must be strings
values_str = [str(value) for value in values]
dbutils.widgets.dropdown("ID", default_value, values_str)
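Side note (not from the thread itself): the default passed to dbutils.widgets.dropdown must be one of the choices, which is why both are converted to strings. The value-preparation step is plain Python and can be sketched outside Databricks, with sample values standing in for the collected IDs:

```python
# Plain-Python sketch of preparing values for a dropdown widget.
# Widget defaults and choices must be strings, so everything is converted.
values = [1, 2, 3]                             # stand-in for the collected IDs
default_value = str(max(values))               # highest ID as the default
values_str = [str(value) for value in values]  # choices as strings
# In a Databricks notebook you would then call:
# dbutils.widgets.dropdown("ID", default_value, values_str)
```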

3 REPLIES

-werners-
Esteemed Contributor III

Building on Hubert's answer:

from datetime import date

# Widget defaults must be strings, so convert the date
date_for_widget = str(date.today())

If you use date_for_widget as your default value, you are there. And of course you can fill this date_for_widget variable with anything you want. You can even fetch a value from outside Databricks, e.g. with Azure Data Factory.
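Putting the two answers together, here is a minimal sketch that answers the original question (the widget name "run_date" is illustrative; dbutils only exists inside a Databricks notebook, so the sketch guards for it to stay runnable elsewhere):

```python
from datetime import date, datetime

# Default to today's date; an ad hoc run can override the widget value.
default_date = date.today().isoformat()  # "YYYY-MM-DD" string

try:
    dbutils  # defined inside a Databricks notebook
except NameError:
    dbutils = None  # running outside Databricks

if dbutils is not None:
    # Create the widget; a scheduled run picks up this default,
    # while an ad hoc run can type in any other date.
    dbutils.widgets.text("run_date", default_date, "Run date")
    run_date_str = dbutils.widgets.get("run_date")
else:
    run_date_str = default_date

# Parse back to a date object for use in the notebook logic.
run_date = datetime.strptime(run_date_str, "%Y-%m-%d").date()
```

One caveat worth knowing: a scheduled job can also pass the widget value as a notebook parameter, which overrides the default, so the same notebook serves both the scheduled and the ad hoc case.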

Kaniz
Community Manager

Hi @philip george, just a friendly follow-up. Do you still need help, or did @Hubert Dudek's and @Werner Stinckens's responses help you find the solution? Please let us know.
