Hi @Ritesh-Dhumne,
I'm assuming you meant the Free Edition rather than the Community Edition, since you're using volumes, which aren't available in the Community Edition.
I'm not sure if I've understood your approach correctly, but at first glance it seems incorrect: you can't pass a DataFrame between tasks. What you can do instead is load all the files from the volume into a bronze table in Notebook1. You can use the special `_metadata` column to record the file_path each row originates from. Here's an example of how to use it:
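Something along the lines of the sketch below (the volume path, file format, and table name are placeholders you'd replace with your own):

```python
from pyspark.sql.functions import col

# Read all files from the volume and keep track of the source file per row
bronze_df = (
    spark.read
    .format("csv")                                     # or json / parquet, depending on your files
    .option("header", "true")
    .load("/Volumes/my_catalog/my_schema/my_volume/")  # placeholder volume path
    .select("*", col("_metadata.file_path").alias("source_file"))
)

# Persist as a bronze table so Notebook2 can pick it up
bronze_df.write.mode("append").saveAsTable("my_catalog.my_schema.bronze_raw")
```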

Then, in Notebook2, you can apply your transformations on top of this bronze table. You can count nulls, handle dirty data, and relate each issue back to the particular file it came from, since that information is already in the bronze table thanks to the `_metadata` special column.
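For instance, a rough sketch of what Notebook2 could look like (the bronze table name, the `source_file` alias, and the `customer_id` column are just placeholders carried over from the previous snippet):

```python
from pyspark.sql.functions import col, count, when

bronze_df = spark.table("my_catalog.my_schema.bronze_raw")  # placeholder table name

# Count nulls per source file so every data-quality issue can be traced
# back to the file it came from
null_report = (
    bronze_df
    .groupBy("source_file")
    .agg(count(when(col("customer_id").isNull(), True)).alias("null_customer_ids"))
)

null_report.show()
```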
From what I can see, you're still in the learning process, so I won't introduce Auto Loader here, which is a pretty handy tool for file ingestion 🙂