Autoloader with file notification

kulkpd
Contributor

I am using DLT with file notification, and the DLT job is only fetching 1 notification from the SQS queue at a time. My pipeline is expected to process 500K notifications per day, but it is running hours behind. Any recommendations?

spark.readStream.format("cloudFiles")
.option("cloudFiles.schemaLocation","/mnt/abc/")
.option('cloudFiles.format', 'json')
.option('cloudFiles.inferColumnTypes', 'true')
.option('cloudFiles.useNotifications', True)
.option('skipChangeCommits', 'true')
.option('cloudFiles.backfillInterval', '3 hour')
.option('cloudFiles.maxFilesPerTrigger', 10000)


Logs:
NotificationFileEventFetcher: [queryId =] Fetched 1 messages from cloud queue storage.
NotificationFileEventFetcher: [queryId =] Fetched 1 messages from cloud queue storage.
NotificationFileEventFetcher: [queryId =] Fetched 1 messages from cloud queue storage.

3 REPLIES

Rdipak
New Contributor II

Can you set this value to a higher number and try?

cloudFiles.fetchParallelism is 1 by default.
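
For reference, a minimal sketch of how that option could be added to the stream definition from the question (the input path is a placeholder, not from the original post):

# Sketch only: same Auto Loader notification-mode setup as the question,
# with cloudFiles.fetchParallelism raised so more SQS messages are fetched per poll.
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/abc/")
    .option("cloudFiles.useNotifications", "true")
    # number of threads used to fetch messages from the queueing service; default is 1
    .option("cloudFiles.fetchParallelism", 100)
    .option("cloudFiles.maxFilesPerTrigger", 10000)
    .load("<source-path>")  # placeholder input path
)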

Thanks! Setting cloudFiles.fetchParallelism to 100 definitely helped to read more messages from SQS.

NotificationFileEventFetcher: [queryId = 111] Fetched 100 messages from cloud queue storage

Kaniz
Community Manager

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance! 
 
