How does Auto Loader work when triggered via Azure Data Factory?

zll_0091
New Contributor III

Hi,

I am currently creating an Auto Loader stream in Databricks and will be using ADF as the orchestrator.

I am quite confused about how this will handle my data, so please clarify if I have misunderstood it.

First, I will run my ADF pipeline, which includes an activity that calls my Auto Loader notebook. Will it work like the diagram below, or will it just process all the files in the folder once I run the ADF pipeline?

Note: I'm using option("includeExistingFiles", False) on my readStream.

[Attached image: zll_0091_0-1721789223632.png]
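For illustration, a minimal sketch of the kind of readStream the question describes (the source path, file format, and schema location are assumptions; the documented spelling of the option carries the cloudFiles. prefix):

# Auto Loader source stream; discovers files in the monitored directory.
# Paths and file format below are placeholders.
df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/autoloader/schema")
    .option("cloudFiles.includeExistingFiles", "false")  # ignore files already present at the first run
    .load("/mnt/raw/input_files")
)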

 

1 ACCEPTED SOLUTION


szymon_dybczak
Contributor III

Auto Loader will process files incrementally. Let's say you have files in an existing directory called /input_files.

The first time you run Auto Loader, it will read all the files in that directory (unless you set the option includeExistingFiles to false, like you did) and save information about which files have been read to the checkpoint location.

The next run will only load new files, because Auto Loader knows what has been loaded previously thanks to the checkpoint.
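A minimal sketch of that flow (the paths, file format, and target table are assumptions): the checkpointLocation on the writeStream is where Auto Loader records which files have already been ingested, so each run only processes files that arrived since the previous one.

# Incremental ingestion with Auto Loader; all paths below are placeholders.
df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/autoloader/schema")
    .load("/input_files")
)

# The checkpoint tracks which files have been processed, so reruns only load new files.
(
    df.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/autoloader/checkpoint")
    .toTable("bronze.input_files")
)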


3 REPLIES


I see. So even though my stream stops, it can still identify the files that were processed using the info in the checkpoint location?

Exactly, you got it right 😉
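For an ADF-triggered notebook, a common pattern (sketched below with assumed paths and table name) is to run the stream with trigger(availableNow=True) and wait for it to finish: the query processes whatever files have arrived, stops, and the next pipeline run restarts it from the same checkpoint, picking up only the files added in the meantime.

# Batch-style Auto Loader run suitable for an ADF notebook activity.
df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/autoloader/schema")
    .load("/input_files")
)

query = (
    df.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/autoloader/checkpoint")
    .trigger(availableNow=True)   # process the current backlog, then stop
    .toTable("bronze.input_files")
)

# Block until processing finishes so the ADF notebook activity completes cleanly.
query.awaitTermination()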
