Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks job/task shows success but original notebook is not updated

Sneha1594
New Contributor III

I have loaded Parquet files into Hive metastore tables, then performed some transformations on the data and generated some visualizations. All of this is done in a notebook.

I have scheduled the notebook to run every morning so that I get a refreshed view of the data. The job shows as succeeded every day; however, the Hive tables are not updated, and neither is the original notebook.

When I open the job, I see an updated copy of the notebook within the job run, but I want the original notebook itself to be updated. Am I doing something wrong?

6 REPLIES

Prabakar
Databricks Employee

Hi @Sneha Mulrajani, the original notebook won't be updated. The job uses the notebook to perform the task and displays the result on the job run page. You can see the result in the output produced by the job run.

If you want the result to be available in the original notebook, the only option is to attach a cluster to the notebook and execute it manually.

Hi @Sneha Mulrajani, why do you want the result in the original notebook when you run it as a job? Today you will get one result and tomorrow you will get a different result. With job runs you can differentiate and compare those results; if you run everything in the same notebook, you don't have a way to see past results. (You can check the revision history, but that is not a good approach.)
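
As a minimal sketch of that idea (not the exact setup from this thread), past run results can also be read programmatically with the Jobs API 2.1 instead of from the original notebook. The job id, the environment variables, and the single-task assumption below are placeholders, and the notebook output is only populated if the notebook ends with dbutils.notebook.exit(...):

```python
# Sketch: list recent runs of a scheduled notebook job and print each run's exit value.
# DATABRICKS_HOST, DATABRICKS_TOKEN, and JOB_ID are placeholders for your workspace.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token
JOB_ID = 123456789                      # hypothetical job id

headers = {"Authorization": f"Bearer {TOKEN}"}

# List the most recent runs of the job, expanding tasks so we get task run ids.
resp = requests.get(
    f"{HOST}/api/2.1/jobs/runs/list",
    headers=headers,
    params={"job_id": JOB_ID, "limit": 5, "expand_tasks": "true"},
)
resp.raise_for_status()

for run in resp.json().get("runs", []):
    # Assumes a single-task job: runs/get-output expects the task's run id.
    task_run_id = run["tasks"][0]["run_id"]
    output = requests.get(
        f"{HOST}/api/2.1/jobs/runs/get-output",
        headers=headers,
        params={"run_id": task_run_id},
    ).json()
    # notebook_output.result is only present if the notebook calls dbutils.notebook.exit(...)
    print(run["run_id"], output.get("notebook_output", {}).get("result"))
```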

Sneha1594
New Contributor III

Thanks @Prabakar Ammeappin and @Databricks learner

I need the notebook to be updated so I can see the visualizations I have created within it. There are widgets in the notebook that are used as inputs (filters) for the visualizations, and the widgets aren't available when I view the results in the job.

Prabakar
Databricks Employee

For widgets, you can use parameters in your jobs.

https://docs.databricks.com/workflows/jobs/jobs.html#run-a-job-with-different-parameters

You can enter parameters as key-value pairs or a JSON object. The provided parameters are merged with the default parameters for the triggered run. You can use this dialog to set the values of widgets.
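
As a rough sketch of how the two fit together (the widget name run_date and the table name are made up for illustration), the notebook declares and reads the widget, and a job parameter with the same name overrides the widget's default when the scheduled run executes:

```python
# In the notebook: declare a text widget with a default value and read it.
dbutils.widgets.text("run_date", "2024-01-01", "Run date")
run_date = dbutils.widgets.get("run_date")

# Use the widget value as a filter for the refreshed data / visualization.
df = spark.table("my_database.my_table").where(f"event_date = '{run_date}'")
display(df)
```

In the job's task settings, a notebook parameter such as run_date = 2024-05-01 (entered as a key-value pair or as the JSON object {"run_date": "2024-05-01"}) then sets that widget's value for the triggered run.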

Vidula
Honored Contributor

Hi @Sneha Mulrajani,

Does @Prabakar Ammeappin's response answer your question? If it does, would you be happy to mark it as best so that other members can find the solution more quickly?

We'd love to hear from you.

Thanks!

Sneha1594
New Contributor III

Done!
