Too many small files in the "landing area"
03-06-2025 07:17 AM
Hello everyone,
I'm currently working on a setup where my unprocessed real-time data arrives as .json files in Azure Data Lake Storage (ADLS). Every x minutes, I use Databricks Autoloader to pick up the new data, run my ETL transformations, and store the cleaned data in Databricks tables. This works fine for moderate-volume sources, but for certain high-volume sources that generate millions of small JSON files per day, I'm hitting the classic "too many small files" issue. The overhead of scanning and listing so many files significantly increases my processing times.
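Here is a simplified sketch of what I'm running today, with the paths, schema location, and target table reduced to placeholders:

```python
# Simplified sketch of my current job: Auto Loader in its default directory
# listing mode, scheduled every x minutes. Paths and the table name are placeholders.
landing_path = "abfss://landing@<storage-account>.dfs.core.windows.net/events/"
checkpoint_path = "abfss://checkpoints@<storage-account>.dfs.core.windows.net/events/"

df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", checkpoint_path)
    .load(landing_path)                      # millions of small .json files accumulate here
)

(
    df.writeStream
    .option("checkpointLocation", checkpoint_path)
    .trigger(availableNow=True)              # periodic, batch-style run every x minutes
    .toTable("bronze.events_raw")            # placeholder target table
)
```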
I've seen suggestions to periodically merge or aggregate these small files into larger ones, but that feels like an additional step that still suffers from the same file-listing overhead. I'm wondering if there's a more direct workaround or best practice to:
- Reduce the overhead when reading or listing these files.
- Possibly store data in a more efficient format (Parquet/Delta) at landing time, if that's feasible.
- Use Autoloader features (like cloudFiles.mergeSchema or cloudFiles.useNotifications) in a more optimal way for large volumes.
Has anyone successfully tackled a similar scenario? Any recommendations on how to handle a massive number of small files without incurring huge overhead on each ETL cycle would be greatly appreciated!
Thank you in advance for your insights.
03-10-2025 08:03 PM
Hi @Jorge3
Since you mentioned the "cloudFiles.useNotifications" option, I assume you already know about Auto Loader's file detection modes. File notification mode should be the best fit for your situation: instead of listing the landing directory on every run, Auto Loader subscribes to storage events (on Azure, via an Event Grid subscription and a storage queue), so the cost of discovering new files no longer grows with the number of files already sitting in the directory. Have you tried it already and run into an issue? If so, please share the details.
This video is a detailed, step-by-step guide to setting up Auto Loader ingestion in file notification mode from an ADLS storage location. I found it helpful and hope it will be useful for you, too!
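To make it concrete, here is a rough sketch of what the switch could look like in PySpark. The paths, secret scope, resource group, and target table are placeholders; the cloudFiles.* settings are the documented Azure options for file notification mode:

```python
# Rough sketch: Auto Loader in file notification mode on ADLS.
# Instead of listing the landing directory, Auto Loader sets up an Event Grid
# subscription plus a storage queue and reads file-arrival events from it.
landing_path = "abfss://landing@<storage-account>.dfs.core.windows.net/events/"
checkpoint_path = "abfss://checkpoints@<storage-account>.dfs.core.windows.net/events/"

df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.useNotifications", "true")   # switch from directory listing to notifications
    # Service principal allowed to create the Event Grid subscription and the
    # storage queue (secret scope and key names below are placeholders):
    .option("cloudFiles.subscriptionId", dbutils.secrets.get("kv", "subscription-id"))
    .option("cloudFiles.tenantId", dbutils.secrets.get("kv", "tenant-id"))
    .option("cloudFiles.clientId", dbutils.secrets.get("kv", "client-id"))
    .option("cloudFiles.clientSecret", dbutils.secrets.get("kv", "client-secret"))
    .option("cloudFiles.resourceGroup", "<resource-group>")
    .option("cloudFiles.schemaLocation", checkpoint_path)
    .load(landing_path)
)

(
    df.writeStream
    .option("checkpointLocation", checkpoint_path)
    .trigger(availableNow=True)                      # keep the existing every-x-minutes schedule
    .toTable("bronze.events_raw")                    # placeholder target table
)
```

With notifications enabled, file discovery cost no longer depends on how many files are already sitting in the landing path, which is exactly where the listing overhead in your current setup comes from.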

