If you have a separate folder for each of your source tables, you can use a Python loop to iterate over those folders.
To do this, create a create_pipeline function that takes table_name, schema, and path as parameters. Inside this function, define the DLT table function that builds your raw (bronze) table from those parameters, as in the sketch below.
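Here is a minimal sketch of such a function. It assumes the source files are JSON ingested with Auto Loader (cloudFiles), that schema is the file schema to apply to the reader, and that spark is the SparkSession provided by the Databricks notebook; the inner function name bronze_table and the format option are placeholders to adjust to your own layout.

import dlt

def create_pipeline(table_name, schema, path):
    # Register a bronze table named after the source folder.
    @dlt.table(
        name=table_name,
        comment=f"Raw bronze table ingested from {path}",
    )
    def bronze_table():
        # Auto Loader (cloudFiles) incrementally ingests new files from the folder.
        # Assumptions: JSON source files and `schema` as the reader's file schema.
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "json")
            .schema(schema)
            .load(path)
        )

Because each call to create_pipeline captures its own parameters, you can safely invoke it inside a loop without the usual closure-over-loop-variable pitfalls.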
You can then call create_pipeline in a loop over each folder in your path, listed with dbutils.fs.ls.
for folder in dbutils.fs.ls("<your path>"):
    # folder.name ends with a trailing "/", so strip it to get the table name
    table_name = folder.name[:-1]
    create_pipeline(table_name, schema, folder.path)