
How does Auto Loader work when triggered via Azure Data Factory?

zll_0091
New Contributor III

Hi,

I am currently building an Auto Loader pipeline in Databricks and will be using ADF as the orchestrator.

I am a bit confused about how this will handle my data, so please clarify if I have misunderstood it.

First, I will run my ADF pipeline, which includes an activity that calls my Auto Loader notebook. Will it work like the flow below, or will it just process all the files in the folder when I run the ADF pipeline?

***I'm using option("includeExistingFiles", False) on my readStream

[Attached diagram: zll_0091_0-1721789223632.png]
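In case a concrete shape helps, below is a minimal sketch of what such a notebook could look like when it is meant to be kicked off by an ADF pipeline run. All paths, the file format, and the target table are placeholders I made up, not values from this post, and note that the documented spelling of the option is cloudFiles.includeExistingFiles:

```python
# Minimal sketch of an Auto Loader notebook intended to be called from an ADF pipeline.
# Every path, the file format, and the target table are hypothetical placeholders.
# `spark` is the SparkSession that a Databricks notebook provides automatically.

source_path = "abfss://landing@mystorage.dfs.core.windows.net/input_files/"                   # hypothetical
checkpoint_path = "abfss://landing@mystorage.dfs.core.windows.net/_checkpoints/input_files/"  # hypothetical

df = (
    spark.readStream
        .format("cloudFiles")
        .option("cloudFiles.format", "csv")                    # assumed source format
        .option("cloudFiles.schemaLocation", checkpoint_path)  # where Auto Loader tracks the inferred schema
        .option("cloudFiles.includeExistingFiles", "false")    # skip files already present at the first start
        .load(source_path)
)

# trigger(availableNow=True) processes whatever is currently unprocessed and then stops,
# which fits batch-style orchestration: each ADF-triggered run picks up only the files
# that arrived since the previous run (tracked in the checkpoint) and then exits.
query = (
    df.writeStream
        .format("delta")
        .option("checkpointLocation", checkpoint_path)
        .trigger(availableNow=True)
        .toTable("bronze.my_table")                            # hypothetical target table
)

query.awaitTermination()   # keep the notebook (and the ADF activity) running until the load finishes
```

With this pattern the stream is not left running continuously: each ADF pipeline run behaves like an incremental batch load, and the checkpoint is what prevents already-processed files from being loaded again.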

 

1 ACCEPTED SOLUTION


szymon_dybczak
Contributor

Auto Loader processes files incrementally. Let's say you have files in an existing directory called /input_files.

The first time you run Auto Loader, it will read all the files in that directory (unless you set the option includeExistingFiles to false, like you did) and save information about which files have been read to the checkpoint location.

The next run will only load new files, because Auto Loader knows what has already been loaded thanks to the checkpoint.
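To make that run-over-run behaviour concrete, here is a compact sketch assuming hypothetical paths and a parquet source (none of these values come from the thread); `spark` is the SparkSession a Databricks notebook provides:

```python
# Compact sketch of the behaviour described above; every path and the file format are assumptions.

def run_autoloader_once():
    """One invocation = one ADF-triggered notebook run; it loads only files not yet seen."""
    query = (
        spark.readStream
            .format("cloudFiles")
            .option("cloudFiles.format", "parquet")                    # assumed source format
            .option("cloudFiles.schemaLocation", "/checkpoints/input_files")
            .option("cloudFiles.includeExistingFiles", "false")        # first start: skip files already present
            .load("/input_files")
            .writeStream
            .format("delta")
            .option("checkpointLocation", "/checkpoints/input_files")  # where "what has been read" is recorded
            .trigger(availableNow=True)                                # process the backlog, then stop
            .start("/bronze/input_files")                              # hypothetical Delta output path
    )
    query.awaitTermination()

# Run 1: little or nothing is processed, because files already in /input_files are excluded.
# Run 2, 3, ...: only files that landed since the previous run are loaded,
# because Auto Loader compares the directory against the state stored in the checkpoint.
run_autoloader_once()
```

One detail worth knowing from the Auto Loader option reference: cloudFiles.includeExistingFiles is evaluated only when a stream is started for the first time against a given checkpoint; changing it on later runs has no effect.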


4 REPLIES


I see. So even though my stream stops, it can still identify the files that were processed using the info in the checkpoint location?

Exactly, you got it right 😉
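If you ever want to look at that bookkeeping directly, recent Databricks Runtime versions expose a cloud_files_state table-valued function for querying which files an Auto Loader checkpoint has recorded. A small sketch, assuming a hypothetical checkpoint path:

```python
# Inspect the file-discovery state Auto Loader stores in a checkpoint.
# Replace the path with the checkpointLocation used by your stream (this one is hypothetical).
spark.sql(
    "SELECT * FROM cloud_files_state('/checkpoints/input_files')"
).show(truncate=False)
```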

Kaniz_Fatma
Community Manager

Hi @zll_0091, thank you for reaching out to our community! We're here to help you.

To ensure we provide you with the best support, could you please take a moment to review the responses and choose the one that best answers your question? Your feedback not only helps us assist you better but also benefits other community members who may have similar questions in the future.

If you found the answer helpful, consider giving it a kudo. If the response fully addresses your question, please mark it as the accepted solution. This will help us close the thread and ensure your question is resolved.

We appreciate your participation and are here to assist you further if you need it!
