Hi everyone,
I have a workflow involving two notebooks: Notebook A and Notebook B. At the end of Notebook A, we generate a variable number of files, let's call it N. I want to run Notebook B for each of these N files.
I know Databricks has a Foreach task that can iterate over a list of items.
Here's what I've tried so far. At the end of Notebook A, I set a task value containing the list of output paths:
output_dir_paths = [<list of paths>]
dbutils.jobs.taskValues.set(key="notebook_A_output_paths", value=output_dir_paths)
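As I understand it, task values have to be JSON-serializable, so I checked that my list round-trips through JSON (plain Python, no Databricks needed; the paths are stand-ins, not my real ones):

```python
import json

# Stand-in for the list I pass to dbutils.jobs.taskValues.set
output_dir_paths = ["/mnt/output/batch_0", "/mnt/output/batch_1"]

# Task values must be JSON-serializable; a round-trip confirms a plain
# list of strings is fine.
encoded = json.dumps(output_dir_paths)
decoded = json.loads(encoded)
print(decoded == output_dir_paths)  # True
```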
ForEach Loop configuration: (screenshot omitted)

The Task configuration: (screenshot omitted)
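If it helps, this is how I think the fields should be wired up, guessing at the dynamic value reference syntax ("Notebook_A" is a stand-in for my actual upstream task name):

Foreach task -> Inputs: {{tasks.Notebook_A.values.notebook_A_output_paths}}
Nested notebook task -> Parameters: single_batch_file = {{input}}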
In Notebook B, I'm attempting to read each path like this:
path = dbutils.widgets.get("single_batch_file")
Could someone please help me correct the code to pass the list of paths from Notebook A, iterate over each path, and send it to Notebook B?
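For reference, here is a minimal local sketch (plain Python, no Databricks) of the fan-out behavior I'm after: one "Notebook B run" per path, where run_notebook_b is just a placeholder for the real notebook logic.

```python
# Stand-in paths, one per file produced by Notebook A.
output_dir_paths = ["/tmp/batch_0.json", "/tmp/batch_1.json"]

def run_notebook_b(single_batch_file: str) -> str:
    # Placeholder for the real Notebook B logic, which receives the
    # path via the single_batch_file widget.
    return f"processed {single_batch_file}"

# The Foreach task should effectively do this: one run per path.
results = [run_notebook_b(p) for p in output_dir_paths]
print(results)
```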