Friday
Hi there,
If I understood correctly, Roland said the output of a SQL task can be used as the input to a ForEach task in Workflows. I tried that and used the expression sqlTaskName.output.rows, but Databricks rejected that expression. Does anyone know how to do this?
Friday
Can you confirm if these are the steps being followed:
Create the SQL Task: Ensure your SQL task is correctly set up and produces the desired output. For example:
SELECT customer_name, market FROM example_customers;
Reference the SQL Task Output in the ForEach Task: Reference the SQL task's output with the expression {{ tasks.sqlTaskName.output.rows }}. Ensure that the task name (sqlTaskName) matches the task key you have defined in your workflow.
Configure the ForEach Task: Set the inputs field of the ForEach task to {{ tasks.sqlTaskName.output.rows }} and define the task to run on each iteration, for example (a fuller sketch of the complete ForEach task follows this snippet):
{
"task_key": "process_customers_iteration",
"parameters": {
"customer_name": "{{ input.customer_name }}",
"market": "{{ input.market }}"
},
"sql_task": {
"file": {
"path": "/path/to/your/sql/file.sql"
}
}
}
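To put the pieces together, here is a rough sketch of how the complete ForEach task could look in the job's JSON definition. The task keys, warehouse ID, and file path are placeholders, the for_each_task fields (inputs, concurrency, task) reflect my understanding of the Jobs API, and the iteration parameters are placed under sql_task, which is where parameters for a SQL file task would normally go, so please treat this as illustrative rather than a definitive spec:
{
  "task_key": "process_customers",
  "depends_on": [
    { "task_key": "sqlTaskName" }
  ],
  "for_each_task": {
    "inputs": "{{ tasks.sqlTaskName.output.rows }}",
    "concurrency": 1,
    "task": {
      "task_key": "process_customers_iteration",
      "sql_task": {
        "file": {
          "path": "/path/to/your/sql/file.sql"
        },
        "warehouse_id": "<your-warehouse-id>",
        "parameters": {
          "customer_name": "{{ input.customer_name }}",
          "market": "{{ input.market }}"
        }
      }
    }
  }
}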
Saturday
Hi there,
Thanks for looking at this. I followed your instructions to the letter, but Databricks refused to accept that dynamic expression as the input of the ForEach task; see the screenshot. The output.rows expression is also not among the dynamic expressions offered by the 'Inputs' box. Also, I can't even save that setting for the ForEach task, let alone execute it. Please advise.
yesterday
May I know from which blog or webinar did you get the information provided in the original post?
yesterday
I got it from this:
What's New in Databricks Workflows - with Live Demos!
He's demoing how SQL task output can be consumed at around the 9th minute.
yesterday
I am asking internally about this and will get back to you once I have more details.
12 hours ago
Our internal team has confirmed that this is currently not working on your side because this feature is still in Private Preview; we will need to wait until it is fully released.