04-29-2024 07:09 PM
I am trying to schedule some jobs using workflows and leveraging dynamic variables. One caveat: when I use {{job.start_time.[iso_date]}}, it appears to default to UTC. Is there a way to change that?
04-30-2024 06:08 AM
Try this configuration and let me know your findings here.
Spark configuration page: https://spark.apache.org/docs/latest/configuration.html
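For reference, a minimal sketch of what that might look like, assuming the setting being suggested is spark.sql.session.timeZone from the linked page (the target timezone below is just an example):
%python
# Assumption: the suggested setting is spark.sql.session.timeZone.
# This only changes how Spark SQL renders timestamps within the session;
# it does not change workflow dynamic values like {{job.start_time.[iso_date]}}.
spark.conf.set("spark.sql.session.timeZone", "America/New_York")

# Verify the setting took effect
print(spark.conf.get("spark.sql.session.timeZone"))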
05-02-2024 07:33 AM
The configurations did not work. Once the job cluster starts, it uses the timezone that is already set. I did not try setting this in an init_script, but I believe the behavior would be the same.
04-30-2024 07:19 AM
Hi, all the dynamic values are in UTC (see the documentation).
Maybe you can use code like the example below and pass the variables between tasks (see "Share information between tasks in a Databricks job")?
%python
from datetime import datetime, timedelta
# Get the current date (naive; datetime.now() uses the driver's system timezone, which is typically UTC on Databricks)
current_date = datetime.now()
# Subtract one day to get the previous day or make any other shift
previous_day = current_date - timedelta(days=1)
# Format the date as a string in the desired format if needed
previous_day_str = previous_day.strftime('%Y-%m-%d')
# Now you can use previous_day_str as a dynamic variable in your notebook logic
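To tie this to the "pass variables between tasks" suggestion, here is a hedged sketch using task values, with the date computed in an explicit timezone via zoneinfo (Python 3.9+). The task name compute_date, the key previous_day, and the timezone are illustrative, not from the thread:
%python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

# Compute "yesterday" in an explicit timezone instead of the driver's default
# (the timezone here is an example).
local_now = datetime.now(ZoneInfo("America/New_York"))
previous_day_str = (local_now - timedelta(days=1)).strftime('%Y-%m-%d')

# Upstream task: publish the value for downstream tasks.
dbutils.jobs.taskValues.set(key="previous_day", value=previous_day_str)

# Downstream task: read it back; taskKey is the upstream task's name,
# and debugValue is used when running the notebook interactively.
previous_day_str = dbutils.jobs.taskValues.get(
    taskKey="compute_date", key="previous_day", debugValue="1970-01-01"
)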
05-02-2024 07:35 AM
This does work. I think I will leverage something like this, even though it is a bit hacky.