Databricks Free Trial Help
Engage in discussions about the Databricks Free Trial within the Databricks Community. Share insights, tips, and best practices for getting started, troubleshooting issues, and maximizing the value of your trial experience to explore Databricks' capabilities effectively.

Notebook crashes when attempting to load a large 800 MB CSV file

violeta482yee
New Contributor

Hello everyone,

I'm facing an issue when attempting to load a large 800 MB CSV file from ADLS using Auto Loader. Unfortunately, the notebook crashes during the loading process. Has anyone experienced something similar, or does anyone have suggestions on how to handle this?

Any tips to resolve or troubleshoot this issue would be greatly appreciated.

 

Thanks in advance,

Violeta

1 REPLY

Alberto_Umana
Databricks Employee

Hi @violeta482yee,

Have you checked the resource availability of the compute attached to the notebook?

One solution could be to use the cloudFiles.maxBytesPerTrigger option. This option lets you control the maximum number of bytes processed in each trigger; for example, setting it to 100m will process up to roughly 100 MB of data per trigger. This can help prevent the notebook from crashing due to memory overload.
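A minimal sketch of what that could look like in a Databricks notebook, assuming an Auto Loader stream over CSV files in ADLS. The container, storage account, paths, and target table name below are all placeholders, not values from the original post:

```python
# Sketch only: assumes a Databricks notebook where `spark` is already defined.
# All abfss:// paths and the table name are hypothetical placeholders.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    # Cap each micro-batch at ~100 MB so the 800 MB file is split
    # across several triggers instead of being read in one go.
    .option("cloudFiles.maxBytesPerTrigger", "100m")
    .option(
        "cloudFiles.schemaLocation",
        "abfss://<container>@<storage-account>.dfs.core.windows.net/schemas/",
    )
    .load("abfss://<container>@<storage-account>.dfs.core.windows.net/raw/")
)

(
    df.writeStream
    .option(
        "checkpointLocation",
        "abfss://<container>@<storage-account>.dfs.core.windows.net/checkpoints/",
    )
    # availableNow processes all pending files and stops, while still
    # respecting the per-trigger byte cap set above.
    .trigger(availableNow=True)
    .toTable("<catalog>.<schema>.<table>")
)
```

Note that maxBytesPerTrigger is a soft maximum, so individual batches can slightly exceed it, but it still keeps memory pressure far below loading the whole file at once.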

That said, it requires more investigation into the compute being used to determine the root cause.
