when attempting to load a large 800 MB CSV file
01-16-2025 02:04 AM
Hello everyone,
I'm facing an issue when attempting to load a large 800 MB CSV file from ADLS using Auto Loader. Unfortunately, the notebook crashes during the loading process. Has anyone experienced something similar, or have any suggestions on how to handle this?
Any tips to resolve or troubleshoot this issue would be greatly appreciated.
Thanks in advance,
Violeta
01-16-2025 04:35 AM
Hi @violeta482yee,
Have you checked the resource availability of the compute attached to the notebook?
One option that could help is cloudFiles.maxBytesPerTrigger. It controls the maximum number of bytes Auto Loader processes in each trigger; for example, setting it to 100m caps each micro-batch at roughly 100 MB. That can prevent the notebook from crashing due to memory pressure when ingesting a large file.
That said, it's worth investigating the compute configuration to determine the underlying cause.
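As a rough sketch of how that option fits into an Auto Loader read, something like the following could be tried in the notebook. The storage paths and account names are placeholders, and `spark` is the session Databricks predefines in notebooks; `cloudFiles.maxBytesPerTrigger` is a soft limit, so a batch may slightly exceed it.

```python
# Placeholder paths; replace <account> with the real ADLS storage account.
autoloader_options = {
    "cloudFiles.format": "csv",
    # Soft cap of ~100 MB of data per micro-batch, so the 800 MB file
    # is ingested in chunks instead of all at once.
    "cloudFiles.maxBytesPerTrigger": "100m",
    # Schema inference/evolution requires a schema location (placeholder).
    "cloudFiles.schemaLocation": "abfss://meta@<account>.dfs.core.windows.net/schemas/demo",
}

def build_stream(spark):
    """Return a streaming DataFrame reading the CSVs via Auto Loader."""
    return (
        spark.readStream
        .format("cloudFiles")
        .options(**autoloader_options)
        .load("abfss://raw@<account>.dfs.core.windows.net/csv/")
    )
```

Pairing this with a right-sized cluster (or autoscaling) is still advisable, since the option only limits batch size, not the total work.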

