maxFilesPerTrigger not working in bronze to silver layer

sanjay
Valued Contributor II

Hi,

I am using a medallion architecture where Auto Loader picks up files from AWS S3 and saves them to Delta Lake (bronze). The next layer picks up the changes from Delta Lake and does some processing. I am able to set the batch size in Auto Loader and it works. But in the bronze-to-silver layer I am unable to set a batch limit; it picks up all files in one go. Here is my code for the bronze-to-silver layer:

(spark.readStream.format("delta")
    .option("useNotification", "true")
    .option("includeExistingFiles", "true")
    .option("allowOverwrites", True)
    .option("ignoreMissingFiles", True)
    .option("maxFilesPerTrigger", 100)
    .load(bronze_path)
    .writeStream
    .option("checkpointLocation", silver_checkpoint_path)
    .trigger(processingTime="1 minute")
    .foreachBatch(foreachBatchFunction)
    .start()
)
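For context on what the option is meant to do: with a Delta source, maxFilesPerTrigger caps how many new files are admitted into each micro-batch (the documented default is 1000), while options such as useNotification and includeExistingFiles appear to be Auto Loader (cloudFiles) options rather than Delta source options. A toy, Spark-free sketch of the capping behaviour (plan_micro_batches is an illustrative name, not a Spark API):

```python
# Toy illustration only (not the Spark API): how a maxFilesPerTrigger-style
# cap splits a backlog of files into micro-batches, preserving order.
def plan_micro_batches(files, max_files_per_trigger):
    """Group a backlog of file names into micro-batches of at most
    max_files_per_trigger files each."""
    if max_files_per_trigger <= 0:
        raise ValueError("max_files_per_trigger must be positive")
    return [files[i:i + max_files_per_trigger]
            for i in range(0, len(files), max_files_per_trigger)]

# Example: a 250-file backlog with a cap of 100 files per trigger
backlog = [f"part-{i:05d}.parquet" for i in range(250)]
batches = plan_micro_batches(backlog, 100)
print([len(b) for b in batches])  # → [100, 100, 50]
```

If the running stream admits all 250 files in a single batch instead, the cap is effectively not being applied to that source.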

Appreciate any help.

Regards,

Sanjay

3 REPLIES

Anonymous
Not applicable

Hi @Sanjay Jain

Great to meet you, and thanks for your question!

Let's see if your peers in the community have an answer to your question. Thanks.

Lakshay
Databricks Employee

Hi @Sanjay Jain, could you try using a fresh checkpoint location, if you haven't already? Also, could you check in the logs what size of micro-batch is currently being processed?
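One lightweight way to check the micro-batch size is to wrap the existing foreachBatch function so each batch's row count is printed to the driver log. A minimal sketch (with_batch_logging is an illustrative name, assuming the batch function takes the usual (batch_df, batch_id) arguments):

```python
# Hypothetical helper (name is illustrative): wraps an existing foreachBatch
# function so every micro-batch's row count is printed before processing.
def with_batch_logging(batch_fn, counter=None):
    def wrapped(batch_df, batch_id):
        n = batch_df.count()  # size of the current micro-batch
        print(f"micro-batch {batch_id}: {n} rows")
        if counter is not None:
            counter.append(n)  # optionally keep counts for inspection
        batch_fn(batch_df, batch_id)
    return wrapped
```

It would then be used as `.foreachBatch(with_batch_logging(foreachBatchFunction))`; if a single batch reports the full table's row count, the stream really is ingesting everything in one go.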

sanjay
Valued Contributor II

Hi Lakshay,

I tried with a new checkpoint location but it is still not working. It takes the whole data in one go and does not respect the batch size set in the code.

Regards,

Sanjay
