05-19-2025 09:52 PM
I am trying to make a reusable workflow where I can run a merge script for any number of tables. The idea is that I pass the workflow the table name and/or path to it, and the workflow references that in the file path field. The simplified idea is below:
resources:
  jobs:
    test_workflow:
      name: test_workflow
      description: some workflow
      tasks:
        - task_key: run_sql_file
          sql_task:
            file:
              path: ${workspace.file_path}/{{job.parameters.sql_directory}}/{{job.parameters.table_name}}.sql
              source: WORKSPACE
            warehouse_id: ${var.serverless_sql_warehouse_ID}
      queue:
        enabled: true
      parameters:
        - name: sql_directory
          default: "path/to/sql"
        - name: table_name
          default: "" # this one I change when running the task or when calling this workflow from another workflow
However, this doesn't work. ${workspace.file_path} is substituted as expected, but {{job.parameters.sql_directory}} and {{job.parameters.table_name}} are never resolved; they don't dynamically update and stay in the path written exactly as-is, curly braces and all.
I'm wondering if there is any way around this or to make this happen?
05-20-2025 03:04 AM
From what I know, you can't mix ${} and {{}} variable styles in the SQL file path. A better approach would be to pass the full SQL file path as a single job parameter, and update its value when triggering the job.
05-20-2025 06:55 AM - edited 05-20-2025 06:59 AM
Thanks @Bhaskar27, unfortunately this doesn't work either:

path: {{job.parameters.merge_file_path}}

I get a YAML error when I try to deploy because "key is not a scalar". So it seems like it is not possible to use this kind of dynamic variable for the sql_task file path.
Edit: just in case this issue was fixed in a newer version, I upgraded my CLI to 0.252.0 (latest version right now) and I get the same error.
06-17-2025 11:35 PM
@kenmyers-8451 were you able to solve this? I'm having a similar issue.
06-17-2025 11:56 PM
Oh, never mind, I got it working. Just putting single quotes around the {{ }} part solves it (I guess double quotes would work as well).
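For anyone hitting the same deploy error: an unquoted value starting with {{ is parsed by YAML as a nested flow mapping, which is why the deploy fails with "key is not a scalar". Quoting turns the value back into a plain scalar that Databricks can resolve at run time. A sketch of the quoted sql_task (the parameter name merge_file_path comes from the earlier reply; the default path is illustrative):

```yaml
tasks:
  - task_key: run_sql_file
    sql_task:
      file:
        # single quotes keep the value a plain YAML scalar;
        # the {{job.parameters.*}} reference is then substituted at run time
        path: '{{job.parameters.merge_file_path}}'
        source: WORKSPACE
      warehouse_id: ${var.serverless_sql_warehouse_ID}
parameters:
  - name: merge_file_path
    default: "path/to/sql/my_table.sql" # illustrative default; override when triggering the job
```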
I think I tried this yesterday but probably ran into another issue with dashes in task names: https://community.databricks.com/t5/data-engineering/dab-for-each-task-passing-task-values/td-p/1210...