12-13-2021 07:15 AM
Hi everyone, can someone help with creating a custom queue for Auto Loader? The default FlushWithClose event is not getting created when my data is uploaded to blob storage. The Azure Databricks docs say:

`cloudFiles.queueName`
The name of the Azure queue. If provided, the cloud files source directly consumes events from this queue instead of setting up its own Azure Event Grid and Queue Storage services. In that case, your `cloudFiles.connectionString` requires only read permissions on the queue.
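For context, this is roughly how the option from the docs is passed to the cloudFiles source. All names below (queue name, connection string, paths) are placeholders, not values from this thread — a sketch, not a definitive configuration:

```python
# Placeholder names throughout; a sketch of pointing Auto Loader at a
# pre-existing Azure storage queue instead of letting it create its own
# Event Grid subscription and queue.
autoloader_options = {
    "cloudFiles.format": "json",
    "cloudFiles.useNotifications": "true",
    # Existing queue that already receives Event Grid notifications:
    "cloudFiles.queueName": "my-existing-queue",
    # With queueName set, this connection string only needs read
    # permission on the queue:
    "cloudFiles.connectionString": "<queue-connection-string>",
}

# In a Databricks notebook this would feed a streaming read, e.g.:
# df = (spark.readStream.format("cloudFiles")
#         .options(**autoloader_options)
#         .load("abfss://container@account.dfs.core.windows.net/input/"))
print(autoloader_options["cloudFiles.queueName"])
```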
Labels:
- Autoloader
- Azure
Accepted Solutions
12-13-2021 07:43 AM
You need to set up the notification services for Blob/ADLS as shown here: https://docs.databricks.com/spark/latest/structured-streaming/auto-loader-gen2.html#cloud-resource-m...
setUpNotificationServices will return a queue name, which can later be used in Auto Loader.
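Once the notification services exist, you can sanity-check that events are actually landing in the queue by peeking at it. This is a sketch using the azure-storage-queue Python SDK, not part of the Databricks API; the queue name and connection string are hypothetical:

```python
import json

def event_summary(message_text: str) -> dict:
    """Parse one Event Grid message body (JSON) and pull out the fields
    that matter for Auto Loader: the event type and the blob URL."""
    event = json.loads(message_text)
    return {
        "eventType": event.get("eventType"),
        "url": event.get("data", {}).get("url"),
    }

# Live usage (requires azure-storage-queue and real credentials;
# queue name and connection string below are hypothetical):
# from azure.storage.queue import QueueClient
# client = QueueClient.from_connection_string("<connection-string>", "<queue-name>")
# for msg in client.peek_messages(max_messages=5):
#     print(event_summary(msg.content))
```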
12-13-2021 09:49 AM
Thanks @HubertDudek, quick question here:
1. If I create a queue based on a particular event, can I just point Auto Loader at that queue alone, or do I have to add some custom message? I see the data in the queue for a flush event contains some kind of JSON.
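On point 1: the JSON in the queue is the standard Event Grid BlobCreated event, which Auto Loader consumes directly, so no custom message should be needed. As a hedged sketch (the sample event below is abbreviated and illustrative, and Auto Loader's exact internal filtering is an assumption here), a write committed through the ADLS Gen2 driver shows up with `data.api` set to `FlushWithClose`:

```python
import json

# Abbreviated, illustrative example of the Event Grid message Auto Loader
# reads from the queue (field values are not from a real event).
sample_event = json.dumps({
    "topic": "/subscriptions/.../Microsoft.Storage/storageAccounts/myaccount",
    "eventType": "Microsoft.Storage.BlobCreated",
    "data": {
        "api": "FlushWithClose",
        "url": "https://myaccount.dfs.core.windows.net/container/path/file.json",
    },
})

def is_commit_event(message_text: str) -> bool:
    """True when the event represents a completed blob write
    (FlushWithClose for ADLS Gen2; PutBlob/PutBlockList for Blob storage)."""
    event = json.loads(message_text)
    return (
        event.get("eventType") == "Microsoft.Storage.BlobCreated"
        and event.get("data", {}).get("api")
        in {"FlushWithClose", "PutBlob", "PutBlockList"}
    )

print(is_commit_event(sample_event))  # → True
```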

