Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Pass through if a job was run as scheduled or if manual

cmilligan
Contributor II

I have a notebook that sets up parameters for the run based on some job parameters set by the user, as well as the current date of the run. I want to bypass some of this logic and just use the manually supplied values when the job is kicked off manually. Is there a way to pass through whether a job was kicked off manually or by a schedule?
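
For illustration, a minimal sketch of the kind of setup described here; the "lookback_days" widget and the date-derivation logic are assumptions, not details from the post (dbutils is available in Databricks notebooks):

from datetime import date

# Hypothetical job parameter surfaced as a notebook widget.
dbutils.widgets.text("lookback_days", "7")
lookback_days = int(dbutils.widgets.get("lookback_days"))

# Derived value: the run window ends at "today" by default.
run_end_date = date.today().isoformat()
print(f"lookback_days={lookback_days}, run_end_date={run_end_date}")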

1 ACCEPTED SOLUTION


UmaMahesh1
Honored Contributor III

Hi @Coleman Milligan,

One approach is to use widgets in the notebooks, with the widgets' default values set to your job parameters. If you run the job manually, whatever parameter values you provide at launch are the ones dbutils.widgets.get returns and uses.

Hope my understanding of your requirement was clear.

Cheers,

Uma Mahesh D
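
For reference, a minimal sketch of this widget-based approach; the run_date and mode parameters are hypothetical names, not from the thread. The widget defaults stand in for the scheduled/derived values, and values supplied on a manual "Run now with different parameters" launch override them:

from datetime import date

# Defaults act as the scheduled/derived values; parameters supplied when the
# job (or notebook) is run manually override them through the widgets.
dbutils.widgets.text("run_date", date.today().isoformat())
dbutils.widgets.text("mode", "scheduled")

run_date = dbutils.widgets.get("run_date")
mode = dbutils.widgets.get("mode")

if mode == "manual":
    # Use the manually supplied values as-is and skip the derivation logic.
    print(f"Manual run: using run_date={run_date} as provided")
else:
    print(f"Scheduled run: deriving parameters from run_date={run_date}")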


3 REPLIES


SS2
Valued Contributor

You can create a widget with dbutils.widgets.text("widgetName", "").

To get the value of that widget, use dbutils.widgets.get("widgetName").

Using this, you can manually create widgets (variables) and run the process with whatever value you want.

If you want the widget to take its value from a job parameter, you can do that as well.
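
Putting those calls together, a short sketch ("widgetName" is just a placeholder):

# Create a text widget with an empty default value.
dbutils.widgets.text("widgetName", "")

# Read the widget's current value. When the notebook runs as a job task and a
# parameter named "widgetName" is supplied, dbutils.widgets.get returns that
# parameter's value; otherwise it returns the default ("").
value = dbutils.widgets.get("widgetName")
print(f"widgetName = {value!r}")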

Sreekanth1
New Contributor II

Can we set a widget value with dbutils.widgets.text in one task's notebook and have another task's notebook get that value within a workflow job?
