Do Databricks workflows support creating a workflow with a dynamic number of tasks?
For example, let's say we have a DAG like this:
T1 -> T2(1)   ->
      T2(2)   ->
      ...        -> T3
      T2(n-1) ->
      T2(n)   ->
In this case task 1 (T1) executes first and creates 1...n tasks (T2) that can execute in parallel. Then once all of those T2 tasks finish, a T3 task can run.
Here you don't know up front how many T2 tasks will exist, because they depend on the output of T1 (which changes from run to run). In my specific case, T1 will generate n rows, with each row providing the parameter needed to query a different server in T2.
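To make the intended semantics concrete, here is a minimal plain-Python sketch of the fan-out/fan-in behavior I'd like the workflow to express (the server names and the bodies of t1/t2/t3 are hypothetical placeholders, not my actual code):

from concurrent.futures import ThreadPoolExecutor

def t1() -> list[str]:
    # In reality this queries a table; the number of rows (n) varies per run.
    return ["server-a", "server-b", "server-c"]

def t2(server: str) -> dict:
    # Hypothetical per-server work; ideally each call would be its own parallel workflow task.
    return {"server": server, "row_count": 0}

def t3(results: list[dict]) -> None:
    # Should run only after every T2 has completed.
    print(f"aggregated {len(results)} result sets")

params = t1()
with ThreadPoolExecutor() as pool:
    t3(list(pool.map(t2, params)))

The question is whether this fan-out can be expressed as actual workflow tasks (so each T2 gets its own task run, retries, and monitoring) rather than being hidden inside a single task like the thread pool above.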