- 5616 Views
- 5 replies
- 1 kudos
When I run my readStream command using .option("cloudFiles.useNotifications", "true"), it starts reading the files from Azure Blob (please note that I did not provide configuration like subscription ID, client ID, connection string and all while...
Latest Reply
Hi, I would like to share the following docs that might be able to help you with this issue: https://docs.databricks.com/ingestion/auto-loader/file-notification-mode.html#required-permissions-for-configuring-file-notification-for-adls-gen2-and-azure-b...
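To make the linked permissions doc concrete, here is a minimal sketch of the Auto Loader file-notification options it describes. The option keys are the documented `cloudFiles.*` settings; every value below is a placeholder, and the source path and format are assumptions for illustration.

```python
# Documented cloudFiles.* options for Azure file notification mode.
# All values are placeholders -- substitute your own Azure resources.
autoloader_options = {
    "cloudFiles.format": "json",            # assumed source format
    "cloudFiles.useNotifications": "true",  # enable file notification mode
    "cloudFiles.subscriptionId": "<azure-subscription-id>",
    "cloudFiles.tenantId": "<azure-tenant-id>",
    "cloudFiles.clientId": "<service-principal-client-id>",
    "cloudFiles.clientSecret": "<service-principal-client-secret>",
    "cloudFiles.resourceGroup": "<storage-account-resource-group>",
}

# In a Databricks notebook these would be applied roughly like:
# df = (spark.readStream.format("cloudFiles")
#       .options(**autoloader_options)
#       .load("abfss://<container>@<account>.dfs.core.windows.net/<path>"))
```

Without these credentials Auto Loader cannot create the Event Grid subscription and queue itself, which is why the doc lists them as required for file notification mode.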
- 3167 Views
- 1 replies
- 0 kudos
Hi All, I have a few streaming jobs running, but we have been facing an issue related to messaging. We have multiple feeds within the same root folder, i.e. logs/{accountId}/CloudWatch|CloudTrail|vpcflow/yyyy-mm-dd/logs. Hence, the SQS allows to setup o...
Latest Reply
@Fernando Messas: Yes, you can configure Auto Loader to consume messages from an SQS queue using EventBridge. Here are the steps you can follow: Create an EventBridge rule to filter messages from the SQS queue based on specific criteria (such as the...
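One way the routing above can come together, as a sketch: once an EventBridge rule forwards the relevant S3 events into a dedicated queue, Auto Loader can be pointed at that existing queue with the documented `cloudFiles.queueUrl` option instead of creating its own. The queue URL and bucket path below are hypothetical.

```python
# Sketch: consume file events from a pre-existing queue that an
# EventBridge rule populates, rather than letting Auto Loader
# provision its own notification resources.
existing_queue_options = {
    "cloudFiles.format": "json",
    "cloudFiles.useNotifications": "true",
    # Hypothetical queue URL -- replace with the queue your rule targets.
    "cloudFiles.queueUrl": "https://sqs.us-east-1.amazonaws.com/123456789012/logs-feed-events",
}

# df = (spark.readStream.format("cloudFiles")
#       .options(**existing_queue_options)
#       .load("s3://my-bucket/logs/"))
```

This keeps one queue per feed, which sidesteps the one-notification-configuration-per-prefix limitation described in the question.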
- 3083 Views
- 2 replies
- 0 kudos
I configured ADLS Gen2 standard storage and successfully configured Auto Loader with file notification mode. In this document, https://docs.databricks.com/ingestion/auto-loader/file-notification-mode.html, it says: "ADLS Gen2 provides different event notificati...
Latest Reply
Hi, @Chris Konsur. You do not need anything with the FlushWithClose event REST API; that is just the event type that we listen to. As for the backfill setting, this is for handling late data or late events that are being triggered. This setting largely de...
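For reference, the backfill behavior mentioned above is controlled by the documented `cloudFiles.backfillInterval` option: Auto Loader periodically runs a directory-listing backfill to pick up files whose notifications were missed or delivered late. A minimal sketch, with an example interval rather than a recommendation:

```python
# Sketch: enable periodic backfills alongside file notification mode so
# that files with missed or late event notifications are still ingested.
backfill_options = {
    "cloudFiles.format": "json",            # assumed source format
    "cloudFiles.useNotifications": "true",
    "cloudFiles.backfillInterval": "1 day",  # interval string, e.g. "1 day", "1 week"
}
```

Shorter intervals give stronger completeness guarantees at the cost of extra listing operations against the storage account.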
- 3283 Views
- 3 replies
- 10 kudos
I'm using Auto Loader in a SQL notebook and I would like to configure file notification mode, but I don't know how to retrieve the client secret of the service principal from Azure Key Vault. Is there any example notebook somewhere? The notebook is p...
Latest Reply
Hi @Magnus Johannesson, you must use the Secrets utility (dbutils.secrets) in a notebook or job to read a secret: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-utils#dbutils-secrets. Hope it helps!
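Since the question concerns a SQL notebook, one option is to read the secret in a Python cell (`%python`) via `dbutils.secrets.get` and pass it into the Auto Loader options. The sketch below shows the call shape; `dbutils` only exists inside Databricks, so a stub stands in here, and the scope and key names are hypothetical.

```python
# Sketch: fetch a service principal secret from a Key Vault-backed
# secret scope. In a Databricks Python cell, pass dbutils.secrets
# as the `secrets` argument.
def get_client_secret(secrets, scope: str, key: str) -> str:
    """Read one secret value from the given scope."""
    return secrets.get(scope=scope, key=key)

class _StubSecrets:
    """Stand-in for dbutils.secrets outside Databricks (illustration only)."""
    def get(self, scope, key):
        return f"<secret:{scope}/{key}>"

# Hypothetical scope/key names:
client_secret = get_client_secret(_StubSecrets(), "kv-scope", "sp-client-secret")
```

On Databricks the returned value would then be supplied as `cloudFiles.clientSecret`; secret values are redacted if printed in notebook output.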
- 2609 Views
- 2 replies
- 3 kudos
First, I tried to configure Autoloader in file notification mode to access the Premium Blob Storage 'databrickspoc1' (Premium, ADLS Gen2). I get this error: com.microsoft.azure.storage.StorageException: I checked my storage account->N...
Latest Reply
When you created the premium account, did you choose "File shares" as the "Premium account type"? It should be "Block blobs".