02-11-2023 08:21 PM
When I run my readStream command with `.option("cloudFiles.useNotifications", "true")`, it starts reading files from Azure Blob Storage (note that I did not provide any configuration such as subscription ID, client ID, or connection string while reading):
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.useNotifications", "true")
    .option("header", True)
    .load(source_data_loc)
)
Now, when I start writing it using the code below:
df.writeStream.format("delta").option("checkpointLocation", checkpoints_loc).outputMode("append").start(target_data_loc)
it starts failing with an error like: Please provide the subscription ID with `cloudFiles.subscriptionId`.
To resolve this issue, I provided all of that information while using readStream, as shown below:
cloudFilesConf = {
    "cloudFiles.subscriptionId": subscriptionId,
    "cloudFiles.clientId": spn_client_id,
    "cloudFiles.connectionString": QueueSASKey,
    "cloudFiles.clientSecret": spn_secret_name,
    "cloudFiles.tenantId": spn_tenant_id,
    "cloudFiles.resourceGroup": ResourceGroup_name,
    "cloudFiles.schemaLocation": schema_loc,
    # "cloudFiles.useNotifications": "true"
}
But when I try to run it now, it fails with "option() got an unexpected keyword argument 'cloudFiles.subscriptionId'", so I am not sure where the issue is.
Please suggest.
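A likely culprit is the difference between `option()` and `options()`: `DataStreamReader.option()` takes a single key/value pair positionally, while the plural `options()` accepts keyword arguments, so a config dict has to be expanded with `options(**cloudFilesConf)`. The minimal stand-in below (not the real PySpark class, just a sketch of the calling convention) reproduces the same TypeError:

```python
# Minimal stand-in for PySpark's DataStreamReader, only to illustrate
# the option()/options() calling convention; NOT the real class.
class FakeReader:
    def __init__(self):
        self._opts = {}

    def option(self, key, value):
        # Singular form: exactly one key/value pair, passed positionally.
        self._opts[key] = value
        return self

    def options(self, **opts):
        # Plural form: accepts a dict expanded into keyword arguments.
        self._opts.update(opts)
        return self


cloudFilesConf = {
    "cloudFiles.subscriptionId": "<subscription-id>",   # placeholder value
    "cloudFiles.resourceGroup": "<resource-group>",     # placeholder value
}

# Works: expand the dict into the plural `options`.
reader = FakeReader().options(**cloudFilesConf)

# Fails the same way as in the question: `option` has no matching keyword,
# so Python raises a TypeError naming the unexpected keyword argument.
try:
    FakeReader().option(**cloudFilesConf)
    err = ""
except TypeError as exc:
    err = str(exc)
print(err)
```

In the real job, that would mean writing `spark.readStream.format("cloudFiles").options(**cloudFilesConf)` rather than `.option(**cloudFilesConf)`.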
02-22-2023 02:27 PM
Hi,
I would like to share the following docs, which might help with this issue: https://docs.databricks.com/ingestion/auto-loader/file-notification-mode.html#required-permissions-f... You need to set the right permissions and define all the required settings to be able to consume the data.
02-22-2023 04:22 PM
Thank you, I found the issue and it is resolved now.
03-17-2023 01:01 AM
Please let us know how the issue got resolved.
06-04-2024 02:23 AM
You need to create the Event Grid subscriptions, and the queues should already be provisioned; this should be part of the CI/CD process.
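When the queue and Event Grid subscription are provisioned up front by CI/CD, Auto Loader can be pointed at the existing queue via the documented `cloudFiles.queueName` option instead of creating resources itself. A sketch of such an options dict (the helper function and all placeholder values are my own, not from the thread):

```python
# Hypothetical helper that assembles Auto Loader file-notification options
# for a queue that CI/CD has already provisioned (all values are placeholders).
def notification_options(connection_string, queue_name, schema_loc):
    return {
        "cloudFiles.format": "csv",
        "cloudFiles.useNotifications": "true",
        # Point Auto Loader at the pre-created queue so it does not try
        # to provision Event Grid resources itself.
        "cloudFiles.connectionString": connection_string,
        "cloudFiles.queueName": queue_name,
        "cloudFiles.schemaLocation": schema_loc,
    }


opts = notification_options(
    "<queue-sas-connection-string>",  # placeholder SAS connection string
    "my-ingest-queue",                # placeholder queue name
    "/tmp/schema",                    # placeholder schema location
)
```

The dict would then be applied with `spark.readStream.format("cloudFiles").options(**opts).load(...)`.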
03-20-2023 11:08 PM
Hi Abhradwip, the issue is resolved.