Data Engineering
Databricks job/task shows success but original notebook is not updated

Sneha1594
New Contributor III

I uploaded parquet files to Hive metastore tables, then performed some transformations on the data and generated some visualizations. All of this is done in a notebook.

I scheduled the notebook to run every morning so that I get a refreshed view of the data. The job shows "Succeeded" every day, yet the Hive tables and the original notebook are not updated.

When I click into the job, I can see an updated copy of the notebook within the job run, but I want the original notebook itself to be updated. Am I doing something wrong?

6 REPLIES

Prabakar
Esteemed Contributor III

Hi @Sneha Mulrajani​, the original notebook won't be updated. The job uses the notebook to perform the task and displays the result on the job's page; you can see the result in the output of the job run.

If you want the result to be available in the original notebook, the only option is to attach a cluster to the notebook and run it manually.

Hi @Sneha Mulrajani​, why do you want the result in the original notebook when you run it as a job? Today you will get one result and tomorrow a different one. With jobs you can differentiate and compare the results of different runs; if you run in the same notebook, you can't see past results. (You could check the revision history, but that is not a good approach.)

Sneha1594
New Contributor III

Thanks @Prabakar Ammeappin​ and @Databricks learner​ 

I need the notebook to be updated so I can see the visualizations I created within it. There are widgets in the notebook that are used as inputs (filters) for the visualizations, and those widgets aren't available when I view the results in the job.

Prabakar
Esteemed Contributor III

For widgets, you can use parameters in your jobs.

https://docs.databricks.com/workflows/jobs/jobs.html#run-a-job-with-different-parameters

You can enter parameters as key-value pairs or a JSON object. The provided parameters are merged with the default parameters for the triggered run. You can use this dialog to set the values of widgets.
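The merge behavior described above can be sketched as follows. This is a minimal illustration of "provided parameters are merged with the defaults", not the actual Jobs implementation; the function name and sample values are hypothetical:

```python
def merge_run_parameters(defaults: dict, overrides: dict) -> dict:
    """Merge trigger-time parameters over the job's default parameters.

    Keys supplied when the run is triggered override the defaults;
    any default not overridden is kept as-is. This mirrors, in spirit,
    how the "Run now with different parameters" dialog combines values
    that your notebook then reads via its widgets.
    """
    merged = dict(defaults)   # start from the job's default parameters
    merged.update(overrides)  # trigger-time values take precedence
    return merged


# Hypothetical job defaults and a one-off override supplied at trigger time
defaults = {"region": "us-east", "date": "2023-01-01"}
overrides = {"date": "2023-06-15"}

print(merge_run_parameters(defaults, overrides))
# {'region': 'us-east', 'date': '2023-06-15'}
```

Inside the notebook, each key would typically correspond to a widget of the same name, so the triggered run picks up the overridden values as its filter inputs.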

Vidula
Honored Contributor

Hi @Sneha Mulrajani​ 

Does @Prabakar Ammeappin​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly?

We'd love to hear from you.

Thanks!

Sneha1594
New Contributor III

Done!
