Structured Streaming with queue in separate storage account
02-14-2025 12:04 AM - edited 02-14-2025 12:05 AM
Hello,
we are running a structured streaming job that consumes zipped JSON files arriving in our Azure Prod storage account. We use Auto Loader with an Event Grid queue, which we pass to the streaming job via cloudFiles.queueName, and we specify a glob pattern in the load() call.
My goal is to run the same job in our Test environment while consuming the files that arrive in Prod. We do not want to copy all files over to Test, but we do want to test our jobs against everything that arrives.
My problem is that if I pass the glob pattern for the Prod storage account, I get the following error:
[STREAM_FAILED] Query [id = xxx, runId = xxx] terminated with exception: The queue prod-queue-on-test doesn't exist SQLSTATE: XXKST
But if I pass the glob pattern for the Test storage account, the job does not process any files, probably because the messages in the queue do not match the glob pattern.
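In file-notification mode, Auto Loader takes the blob paths from the queue messages and then applies the glob filter to them, so a Prod blob path will never match a glob rooted in the Test account. A minimal sketch of that mismatch (account, container, and path names are hypothetical):

```python
from fnmatch import fnmatch

# Hypothetical blob path as it would appear in a queue message from the
# Prod storage account, plus globs rooted in Test and in Prod.
prod_blob_path = "abfss://data@prodaccount.dfs.core.windows.net/landing/2025/02/file.json.gz"
test_glob = "abfss://data@testaccount.dfs.core.windows.net/landing/*/*/*.json.gz"
prod_glob = "abfss://data@prodaccount.dfs.core.windows.net/landing/*/*/*.json.gz"

print(fnmatch(prod_blob_path, test_glob))  # False: account names differ
print(fnmatch(prod_blob_path, prod_glob))  # True: pattern matches the Prod path
```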
We're running more or less the following code:
df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", data_format)
    .option("cloudFiles.schemaLocation", autoloader_schema_path)
    .option("checkpointLocation", stream_checkpoint_path)
    .option("cloudFiles.useNotifications", "true")
    .options(**{
        "cloudFiles.subscriptionId": subscription_id,
        "cloudFiles.clientId": client_id,
        "cloudFiles.clientSecret": kv_secret_key,
        "cloudFiles.tenantId": tenant_id,
        "cloudFiles.resourceGroup": f"rg-{stage}-xxx",
        "cloudFiles.queueName": queue_name,
    })
    .schema(input_schema)
    .load(path_glob_prefix, pathGlobFilter=path_glob_suffix)
)

