Data Engineering

Change {{job.start_time.[iso_date]}} Timezone

QuantumFries
New Contributor II

I am trying to schedule some jobs using workflows and leveraging dynamic variables. One caveat: when I use {{job.start_time.[iso_date]}}, it seems to default to UTC. Is there a way to change that?

4 REPLIES

Aviral-Bhardwaj
Esteemed Contributor III

[Screenshot: AviralBhardwaj_0-1714482356500.png]

Try this configuration and let me know your findings here.

Spark configuration page: https://spark.apache.org/docs/latest/configuration.html
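For reference, a likely reading of the configuration in the screenshot above is the Spark session time zone (this is an assumption; the screenshot itself is not reproduced here). Set as a cluster Spark config, it would look like the sketch below. Note that this setting only changes how Spark SQL renders and parses timestamps inside the cluster; it does not change the value Databricks substitutes for {{job.start_time.[iso_date]}}, which is consistent with the result reported in the next reply.

```
spark.sql.session.timeZone America/New_York
```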

The configurations did not work. Once the job cluster starts, it uses the value that has already been substituted into the parameter. I did not try setting this in an init script, but I believe the behavior would be the same.

artsheiko
Valued Contributor III

Hi, all the dynamic values are in UTC (documentation).

Maybe you can use code like the example below and pass the variables between tasks (see Share information between tasks in a Databricks job)?

%python
from datetime import datetime, timedelta

# Get the current date
current_date = datetime.now()

# Subtract one day to get the previous day or make any other shift
previous_day = current_date - timedelta(days=1)

# Format the date as a string in the desired format if needed
previous_day_str = previous_day.strftime('%Y-%m-%d')

# Now you can use previous_day_str as a dynamic variable in your notebook logic

 
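Building on the snippet above, since the dynamic values are always rendered in UTC, another option is to pass the UTC timestamp into the notebook (for example as a job parameter) and convert it there with the standard library. This is a sketch under assumptions: the literal timestamp, the parameter it would come from, and the target time zone "America/New_York" are illustrative, not from the original thread.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library, Python 3.9+

# Example UTC timestamp, e.g. as received from a job parameter
# populated by a dynamic value (illustrative literal below).
utc_start = "2024-04-30T12:00:00"

# Parse it and mark it explicitly as UTC
dt_utc = datetime.fromisoformat(utc_start).replace(tzinfo=timezone.utc)

# Convert to the desired local time zone (assumed here for illustration)
dt_local = dt_utc.astimezone(ZoneInfo("America/New_York"))

# Format the local date as a string for downstream logic
local_date = dt_local.strftime("%Y-%m-%d")
```

The conversion happens inside the notebook, so the job-level dynamic value can stay in UTC while task logic works in local time.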

This does work. I think I will leverage something like this, even though it is a bit hacky.