Databricks Free Trial Help
Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insights, tips, and best practices for getting started, troubleshooting issues, and maximizing the value of your trial experience to explore Databricks' capabilities effectively.

When attempting to load a large 800 MB CSV file

violeta482yee
New Contributor

Hello everyone,

I'm facing an issue when attempting to load a large 800 MB CSV file from ADLS using Auto Loader. Unfortunately, the notebook crashes during the loading process. Has anyone experienced something similar, or does anyone have suggestions on how to handle this?

Any tips to resolve or troubleshoot this issue would be greatly appreciated.


Thanks in advance,

Violeta

1 REPLY

Alberto_Umana
Databricks Employee

Hi @violeta482yee,

Have you checked the resource availability of the compute attached to the notebook?

One solution could be to use the cloudFiles.maxBytesPerTrigger option. It controls the maximum number of bytes processed in each trigger: for example, setting it to 100m processes up to 100 MB of data per trigger, which can help prevent the notebook from crashing due to memory overload.
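
For reference, here is a minimal PySpark sketch of how that option might be wired into an Auto Loader stream. The ADLS paths, schema/checkpoint locations, and target table name below are hypothetical placeholders, so adjust them to your setup:

    # Minimal Auto Loader sketch; all paths and the table name are placeholders
    df = (
        spark.readStream
            .format("cloudFiles")
            .option("cloudFiles.format", "csv")
            # Soft cap of ~100 MB of data per micro-batch to limit memory pressure
            .option("cloudFiles.maxBytesPerTrigger", "100m")
            .option("cloudFiles.schemaLocation",
                    "abfss://container@account.dfs.core.windows.net/_schemas/demo")
            .load("abfss://container@account.dfs.core.windows.net/raw/csv/")
    )

    (df.writeStream
        .option("checkpointLocation",
                "abfss://container@account.dfs.core.windows.net/_checkpoints/demo")
        .trigger(availableNow=True)
        .toTable("main.default.demo_table"))

Note that maxBytesPerTrigger is a soft maximum, so batches can slightly exceed it, but it should keep each trigger well below the full 800 MB.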

That said, it would take more investigation into the compute being used to determine the root cause.
