Asset Bundle: inject job start_time parameter
Hey!
I'm deploying a job with Databricks Asset Bundles.
When the PySpark task starts on a job cluster, I want the Python code to read the job's start_time and select the right data sources based on that parameter.
Ideally, I would read the parameter from sys.argv in the Python task code:
import sys

def run():
    print(sys.argv)
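To illustrate, this sketch is roughly what I have in mind, assuming the job parameter gets forwarded to the wheel entry point as a --start_time command-line argument (I'd verify the exact form by printing sys.argv first):

import argparse
import sys

def run():
    # Hypothetical sketch: assumes the job parameter arrives as a
    # --start_time argument; check the actual form via print(sys.argv).
    parser = argparse.ArgumentParser()
    parser.add_argument("--start_time", default=None)
    args, _ = parser.parse_known_args(sys.argv[1:])
    print(f"Job start_time: {args.start_time}")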
In my Databricks bundle YAML definition, I have this block:
resources:
  jobs:
    job1:
      name: "Job 1"
      tasks:
        - task_key: task1
          job_cluster_key: generic_cluster
          depends_on: []
          python_wheel_task:
            package_name: my_package
            entry_point: my_module.run
          libraries:
            - whl: ${var.PACKAGE_ARTIFACT_LOCATION}
      job_clusters:
        - job_cluster_key: generic_cluster
          new_cluster:
            ...
      parameters:
        - name: start_time
          default: ${job.start_time.iso_datetime}
The bundle validates successfully, but on bundle deploy I get an error:
Error: Reference to undeclared resource

  on bundle.tf.json line 50, in resource.databricks_job.job1.parameter[0]:
  50:   "default": "${job.start_time.iso_datetime}",

A managed resource "job" "start_time" has not been declared in the root module.
It looks like the ${...} syntax is being consumed by the bundle's own (Terraform-based) variable interpolation instead of being passed through to the job. How do I correctly define the dynamic job start_time parameter?
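Based on the dynamic value references docs, my guess is that the double-curly-brace syntax would avoid the bundle's interpolation, something like this (unverified sketch):

parameters:
  - name: start_time
    default: "{{job.start_time.iso_datetime}}"

Can anyone confirm whether that is the right approach?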
1 REPLY
The Databricks CLI version is v0.239.1.

