Hi @rameshybr,
One way to do this is to list the files at a given location with dbutils.fs.ls and then iterate over the resulting list, processing one file at a time:
files = dbutils.fs.ls("abfss://container@account_name.dfs.core.windows.net/folder")
for file in files:
    # read each file into its own DataFrame
    df = spark.read.json(file.path)
    # implement further logic
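Note that dbutils.fs.ls also returns subdirectories and any non-JSON entries, so if the folder can contain anything besides your data files, it is worth filtering the list first. A minimal sketch, assuming all the data files end in .json (directory paths end in "/", so they are excluded by this check):

json_files = [f for f in files if f.path.endswith(".json")]
for file in json_files:
    df = spark.read.json(file.path)
    # implement further logic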