yesterday
Hi all
I have a job triggered by a file event on an external location.
The location and the job trigger work fine when uploading a file via the Azure Portal.
I need the trigger to fire for SFTP uploads, so I went into Event Grid, found the subscription for the storage account in question, and modified the filters by adding SftpCommit and SftpRename (the third party sometimes uploads a .part file and then renames it).
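For reference, the same filter change can be scripted. A minimal sketch with the azure-mgmt-eventgrid Python SDK; the resource names are placeholders, and I'm assuming the Databricks-created subscription uses a StringIn advanced filter on data.api:

```python
# Minimal sketch: extend the data.api filter on the Databricks-created
# event subscription. All resource names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventgrid import EventGridManagementClient
from azure.mgmt.eventgrid.models import StringInAdvancedFilter

client = EventGridManagementClient(DefaultAzureCredential(), "<subscription-id>")

sub = client.system_topic_event_subscriptions.get(
    "<resource-group>", "<system-topic>", "<databricks-subscription>"
)

# Add the SFTP operations to the existing data.api StringIn filter
for f in sub.filter.advanced_filters or []:
    if isinstance(f, StringInAdvancedFilter) and f.key == "data.api":
        f.values.extend(["SftpCommit", "SftpRename"])

client.system_topic_event_subscriptions.begin_create_or_update(
    "<resource-group>", "<system-topic>", "<databricks-subscription>", sub
).result()
```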
When I upload a sample file manually via SFTP, I can see the event in Event Grid displayed as a matched event.
A second later I see a message in the Azure Storage account queue (the queue is automatically created by Databricks for this external location). Then the message disappears.
Job... nothing.
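For anyone debugging the same thing: you can watch what lands in the queue before Databricks consumes it. A minimal sketch with the azure-storage-queue Python SDK; the connection string and queue name are placeholders:

```python
# Peek at pending messages without dequeuing them, so the Databricks
# consumer still sees them. Placeholders for the connection string and
# the Databricks-created queue name.
import time
from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string(
    "<storage-connection-string>", "<databricks-created-queue>"
)

while True:
    for msg in queue.peek_messages(max_messages=10):
        print(msg.content)  # the raw Event Grid payload
    time.sleep(2)
```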
Second part of the riddle.
Files arrive ONLY via SFTP.
The job does trigger, but completely out of sync with the events.
I checked the events in the storage log (local time zone):
And here are the job runs:
They are hours out of sync with the queue.
The job is configured to be queued with 1 concurrent run. A cluster is dedicated to the job.
Anyone seen this before? What is happening?
yesterday
Update
It appears that even uploading via the UI does not trigger it any more, although it did weeks ago.
I have just uploaded a file via the UI and saw this message in the storage queue:
{"topic":"/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.Storage/storageAccounts/xxx","subject":"/blobServices/default/containers/kumho/blobs/stock/test.csv","eventType":"Microsoft.Storage.BlobCreated","id":"d85d530f-401e-0043-7d65-1ca903060b80","data":{"api":"PutBlob","requestId":"d85d530f-401e-0043-7d65-1ca903000000","eTag":"0x8DDEA7C98CE433C","contentType":"text/csv","contentLength":34300,"blobType":"BlockBlob","accessTier":"Default","blobUrl":"https://xxx.blob.core.windows.net/kumho/stock/test.csv","url":"https://xxx.blob.core.windows.net/kumho/stock/test.csv","sequencer":"00000000000000000000000000036fc90000000000156c90","identity":"$superuser","storageDiagnostics":{"batchId":"1616f433-a006-005b-0065-1c7664000000"}},"dataVersion":"","metadataVersion":"1","eventTime":"2025-09-02T23:58:21.6954353Z"}
The job didn't trigger at all.
yesterday
The UI issue turned out to be driven by permissions: the job's "Run as" user must have access to the external location. Once that was granted, the trigger started firing.
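For reference, the grant can also be scripted. A minimal sketch with the Databricks Python SDK; the names are placeholders, and I'm assuming READ FILES is the privilege that matters here (the SQL equivalent would be GRANT READ FILES ON EXTERNAL LOCATION ... TO ...):

```python
# Grant the job's "Run as" principal access to the external location.
# Assumes READ FILES is sufficient; names are placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()
w.grants.update(
    securable_type=catalog.SecurableType.EXTERNAL_LOCATION,
    full_name="<external-location-name>",
    changes=[
        catalog.PermissionsChange(
            principal="<run-as-user@example.com>",
            add=[catalog.Privilege.READ_FILES],
        )
    ],
)
```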
SFTP still does not trigger regardless; only the HTTP protocol does.
SFTP events come through and create a message, but Databricks looks at the message and discards it. It seems to be looking for very specific events, so not every message in the storage account queue is respected.
Can anyone who has managed to get the SFTP trigger working advise?
yesterday
Hi @Dimitry ,
I was investigating a similar case in another thread, and in my case it worked as expected. Here's the reply I posted back then; try following all the steps from scratch:
Solved: Job File Event Trigger not firing for SftpCommit a... - Databricks Community - 128356
"Ok, I've recreated your scenario (more or less). So I enabled SFTP on my storage account and created home directory for my SFTP user:
Then in Databricks I enabled file events for the external location (which is recommended). To enable it, you need to make sure that your Unity Catalog access connector has the appropriate permissions, so check that your managed identity has the following roles:
Manage external locations - Azure Databricks | Microsoft Learn
Next, you need to enable file events for your external location. Go to Unity Catalog and click External Locations.
Select the one for which you want to enable file events.
Once you are inside the external location, click the Edit button:
Then tick Enable file events and click Auto-fill access connector ID.
Now you can configure your job with a file arrival trigger. If everything went smoothly, you should see an Event Grid system topic and an event subscription created for you by Databricks in your storage account.
I've tested it by uploading a file using WinSCP, and the file arrival trigger worked like a charm 🙂"
yesterday
I have a bunch of external locations created by Databricks out of the box, so file events are enabled and all the checkboxes ticked. None of them trigger when I upload a file via SFTP.
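For what it's worth, the checkbox state can also be verified outside the UI. A minimal sketch with the Databricks Python SDK, assuming a recent SDK version that exposes the file-events fields on external locations (the location name is a placeholder):

```python
# Inspect an external location's file-events configuration.
# Assumes a recent databricks-sdk that exposes these fields.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
loc = w.external_locations.get("<external-location-name>")
print("URL:", loc.url)
print("File events enabled:", loc.enable_file_events)
print("File event queue:", loc.file_event_queue)
```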
Can you query the logs on your storage account and see what protocol and what events WinSCP generated?
In my case the events are SftpCommit and SftpRename.
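For example, something along these lines would show the operation behind each upload, assuming the storage account's blob diagnostic logs are routed to a Log Analytics workspace (the standard StorageBlobLogs table; the workspace ID is a placeholder):

```python
# Query storage diagnostic logs for SFTP operations. Assumes blob
# diagnostics are sent to a Log Analytics workspace.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
StorageBlobLogs
| where TimeGenerated > ago(1d)
| where OperationName startswith "Sftp"
| project TimeGenerated, OperationName, AuthenticationType, Uri
| order by TimeGenerated desc
"""

response = client.query_workspace(
    "<log-analytics-workspace-id>", query, timespan=timedelta(days=1)
)
for table in response.tables:
    for row in table.rows:
        print(row)
```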
Note that when you enable file events, the automatic setup in Event Grid filters on the following operations:
None of these are generated by SFTP when just uploading a file (not sure about deletion etc.), so in that case the storage account queue for the external location does not receive a message at all.
I added SftpCommit to the list of filters in Event Grid. The message arrives, Databricks processes it, and... nothing.
I'm interested to know how it was triggered in your case, and over which protocol.
3 hours ago
Hi @Dimitry ,
Unfortunately, I don't currently have access to my company's Visual Studio subscription, so I can't recreate it. But maybe you've found a bug? If you set up the SftpCommit and SftpCreate events in the system topic's advanced filters, Event Grid properly forwards the events to the storage queue, and those events are consumed by the file arrival trigger but the job is not started, then to me it looks like a bug.