You cannot directly access a dynamic value like ${job.start_time.iso_datetime} in a Databricks Asset Bundle YAML for job parameters: Databricks does not automatically inject special variables (such as the job run's start time) into job parameters at runtime. The error you see ("Reference to undeclared resource") occurs because the ${...} syntax in bundle YAML is resolved at deploy time against declared bundle variables and resources, not against runtime metadata.
How Job Runtime Parameters Work in Databricks Asset Bundles
- The parameters block in Bundles is for static configuration resolved at deploy time.
- There is no automatic "job start time" variable you can inject at deploy or submit time; the value is not known until the job actually runs.
Ways to Access the Job Run's Start Time in Task Code
1. Use Databricks Jobs Runtime Context (Recommended)
When your job runs, Databricks injects a context file into the cluster under /databricks/driver/databricks-job-context.json with run metadata, including the start time, job ID, run ID, and more.
You can read this in Python:
import json

def run():
    # Databricks writes run metadata to this file on the job cluster's driver.
    with open("/databricks/driver/databricks-job-context.json") as f:
        context = json.load(f)
    print(context)
    start_time = context["run"]["start_time"]  # Unix timestamp in milliseconds
- The exact structure may vary; inspect the file to confirm the keys before relying on them.
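For example, to turn that millisecond timestamp into a timezone-aware datetime (a minimal sketch, assuming start_time was read as shown above; the literal value is just an illustration):

from datetime import datetime, timezone

# In practice, use start_time = context["run"]["start_time"] from the file above.
start_time = 1762343580000  # example value, milliseconds since the Unix epoch
start_dt = datetime.fromtimestamp(start_time / 1000, tz=timezone.utc)
print(start_dt.isoformat())  # 2025-11-05T11:53:00+00:00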
2. Pass the Datetime as a Parameter at Submit Time
If you submit jobs via API or CLI, you can programmatically pass the current time as a parameter:
databricks jobs run-now --job-id <job_id> --notebook-params '{"start_time":"2025-11-05T11:53:00Z"}'
But Bundles do not let you specify a function as a parameter default.
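From Python, the same approach via the Databricks SDK looks like this (a sketch; the job ID and the start_time parameter name are placeholders for your own job):

from datetime import datetime, timezone
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from the environment or a config profile

# Compute the timestamp at submit time and hand it to the run as a parameter.
now_iso = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
w.jobs.run_now(job_id=123, notebook_params={"start_time": now_iso})

Note that this captures the submit time; if the run sits in a queue, the actual start time can be later.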
3. Use sys.argv in Wheels
If you want to read arguments from sys.argv in a python_wheel_task, you must pass them via the task's parameters: block at submit time; again, Bundles YAML has no dynamic option for runtime metadata.
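A minimal wheel entry point for this pattern (a sketch; the --start-time flag is an assumption and must match whatever you list under parameters:):

import argparse
from datetime import datetime

def main():
    # Matches e.g. parameters: ["--start-time", "2025-11-05T11:53:00Z"]
    parser = argparse.ArgumentParser()
    parser.add_argument("--start-time", required=True,
                        help="ISO-8601 timestamp supplied at submit time")
    args = parser.parse_args()
    start_time = datetime.strptime(args.start_time, "%Y-%m-%dT%H:%M:%SZ")
    print(f"Processing data as of {start_time.isoformat()}")

if __name__ == "__main__":
    main()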
Example: Reading Context in a Python Wheel Task
- Don't try to define ${job.start_time...} as a parameter in Bundles.
- In your Python entry point, read databricks-job-context.json as above.
- Use that value to select data sources as required (see the sketch after this list).
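Putting the pieces together, a hedged end-to-end sketch (the context file path and key names follow the example above and should be verified on your cluster; the partition logic is a placeholder):

import json
from datetime import datetime, timezone

CONTEXT_PATH = "/databricks/driver/databricks-job-context.json"

def main():
    # Read the run's start time from the job context file; fall back to "now"
    # so the entry point also works outside of a Databricks job run.
    try:
        with open(CONTEXT_PATH) as f:
            context = json.load(f)
        start_ms = context["run"]["start_time"]  # verify these keys on your cluster
        run_start = datetime.fromtimestamp(start_ms / 1000, tz=timezone.utc)
    except (FileNotFoundError, KeyError):
        run_start = datetime.now(timezone.utc)

    # Use the run date to pick a data source, e.g. a date partition.
    partition = run_start.strftime("%Y-%m-%d")
    print(f"Reading partition date={partition}")  # replace with real I/O

if __name__ == "__main__":
    main()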
Key Takeaways
- There is no ${job.start_time} variable available at job definition time.
- To get the runtime value, read the Databricks job context JSON on the cluster.
- Do not try to define the job start time as a parameter in Bundles YAML; remove that block, or replace it with a placeholder if needed for other uses.
References
- Databricks job context documentation (search: "databricks-job-context.json job context").
- More on job parameterization and best practices (search: "databricks job parameters best practices").
In summary:
Remove the parameters: start_time block from your YAML, and in your Python wheel task code, load /databricks/driver/databricks-job-context.json to get the run's actual start time and more.