I have a few serverless jobs, each of which currently contains a single task. These tasks take multiple parameters, some of which are shared between the jobs and some of which are unique to each job.
Is it possible to define all the parameters for all the jobs in a centralised location and then reference them from each task?
job.yml:

```yaml
tasks:
  - task_key: my_key
    spark_python_task:
      python_file: my_path
      parameters: ${var.default_serverless_parameters}
    environment_key: my_environment_key
```
serverless_parameters.yml:

```yaml
variables:
  default_serverless_parameters:
    description: "My list of parameters"
    type: complex
    default:
      - --param1
      - ${value1}
      - --param2
      - ${value2}
```
`databricks bundle validate` accepts the above, but when I try to run the job I get the following error:

```
TypeError: spark_python_task.parameters must be a list
```
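For context, what I expect the variable interpolation to produce is a task whose `parameters` field is the resolved list, roughly like this (the concrete values are placeholders for whatever `${value1}` and `${value2}` resolve to in my bundle):

```yaml
tasks:
  - task_key: my_key
    spark_python_task:
      python_file: my_path
      parameters:
        - --param1
        - value1
        - --param2
        - value2
    environment_key: my_environment_key
```

Instead, the runtime error suggests `parameters` is being passed as a single string rather than expanded into a list.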