08-13-2025 09:23 AM
Hi there,
We are using an Azure Storage Account and its SFTP feature. We have 3rd parties we work with that submit reports to us via SFTP into Azure Blob Storage. We have set up a file arrival trigger for that external location.
Everything works fine if you upload a file manually to the storage container. However, if a file gets added via SFTP, nothing happens. We have tried adding the SftpCommit and SftpCreate events to the Azure Storage Account, but it doesn't make a difference.
Does anyone know why this isn't supported or if we can somehow get this working?
Thank you.
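For anyone reproducing this, the failing path is a plain SFTP upload to the Blob endpoint. A minimal sketch with paramiko, assuming placeholder host, credentials and paths (none of these names are from the original post):

```python
# Minimal sketch: upload a file over the storage account's SFTP endpoint,
# i.e. the path that was not firing the Databricks file arrival trigger.
# Host, username format, password and paths are all placeholders.
import paramiko

HOST = "<storageaccount>.blob.core.windows.net"   # Azure Blob SFTP endpoint
USER = "<storageaccount>.<container>.<sftpuser>"  # Azure SFTP local-user format
PASSWORD = "<sftp-password>"

transport = paramiko.Transport((HOST, 22))
transport.connect(username=USER, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    # The upload lands in the container as a block blob.
    sftp.put("report.csv", "incoming/report.csv")
finally:
    sftp.close()
    transport.close()
```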
08-13-2025 10:37 AM - edited 08-13-2025 10:38 AM
Hi @akdav,
Could you check if your account has hierarchical namespace enabled? According to the documentation, it's required for SFTP events.
That would also explain why it works when you upload files manually, but not when you use SFTP.
"SFTP events These events are triggered if you enable a hierarchical namespace on the storage account, and clients use SFTP APIs. For more information about SFTP support for Azure Blob Storage, see SSH File Transfer Protocol (SFTP) in Azure Blob Storage."
08-14-2025 01:02 AM
Hi @szymon_dybczak
Thanks for the suggestion - it is enabled
It works fine when I upload blobs manually or via the API, but it doesn't trigger via the SFTP user route.
This flow is also linked to Power Automate to move files into OneDrive. That works fine, so I guess this might just be an event which Databricks doesn't listen to? Do they document anywhere which file events they monitor, to your knowledge?
Thanks
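For reference, an SFTP upload to an HNS-enabled account still raises a standard Microsoft.Storage.BlobCreated event; the SFTP-specific part only shows up in the data.api field. A sketch of the relevant fields (all values are illustrative, not captured from a real event):

```python
# Illustrative shape of the Event Grid event raised by an SFTP upload.
# A subscription filtered only on eventType Microsoft.Storage.BlobCreated
# should still match it; the SFTP detail lives in data.api.
sftp_blob_created = {
    "eventType": "Microsoft.Storage.BlobCreated",
    "subject": "/blobServices/default/containers/<container>/blobs/incoming/report.csv",
    "data": {
        "api": "SftpCommit",  # SftpCreate on file creation, SftpCommit on close
        "blobType": "BlockBlob",
        "contentLength": 1024,
        "url": "https://<storageaccount>.blob.core.windows.net/<container>/incoming/report.csv",
    },
}
```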
08-14-2025 03:37 AM - edited 08-14-2025 03:39 AM
Hi @akdav,
Ok, I've recreated your scenario (more or less). So I enabled SFTP on my storage account and created a home directory for my SFTP user:
Then in Databricks I've enabled file events for the external location (which is recommended). To enable it, you need to make sure that your Unity Catalog access connector has the appropriate permissions, so check if your managed identity has the following roles (a programmatic check is sketched below the link):
Manage external locations - Azure Databricks | Microsoft Learn
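If you'd rather check those role assignments from code than click through the IAM blades, a rough sketch with azure-mgmt-authorization (subscription, resource names and principal ID are placeholders):

```python
# Rough sketch: list role assignments held by the access connector's
# managed identity on the storage account scope. All IDs are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
PRINCIPAL_ID = "<access-connector-managed-identity-object-id>"
SCOPE = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<storageaccount>"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
for ra in client.role_assignments.list_for_scope(
    SCOPE, filter=f"principalId eq '{PRINCIPAL_ID}'"
):
    # role_definition_id is a GUID path; resolve it to a friendly name in
    # the portal or via client.role_definitions.get_by_id(...).
    print(ra.role_definition_id)
```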
Next, you need to enable file events for your external location. Go to Unity Catalog and click external locations.
Select the one for which you want to enable file events.
Now, once you are inside the external location, click the edit button:
Then tick Enable file events and click Auto-fill access connector ID.
Now you can configure your job with a file arrival trigger. If everything went smoothly, you should see an Event Grid system topic and an event subscription created for you by Databricks in your storage account.
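If you prefer to wire the trigger up from code instead of the jobs UI, a minimal sketch with the Databricks Python SDK (job name, notebook path and location URL are placeholders, and the exact trigger classes may vary by SDK version):

```python
# Minimal sketch: create a job whose file arrival trigger watches the
# external location path. All names, paths and URLs are placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # picks up auth from the environment / .databrickscfg

job = w.jobs.create(
    name="sftp-report-ingest",
    tasks=[
        jobs.Task(
            task_key="ingest",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/ingest_reports"),
        )
    ],
    trigger=jobs.TriggerSettings(
        file_arrival=jobs.FileArrivalTriggerConfiguration(
            url="abfss://<container>@<storageaccount>.dfs.core.windows.net/incoming/"
        ),
        pause_status=jobs.PauseStatus.UNPAUSED,
    ),
)
print(job.job_id)
```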
I've tested it by uploading a file using WinSCP and the file arrival trigger worked like a charm.
08-20-2025 09:47 AM
Hi,
Thanks for your input. I got this working by creating a test setup similar to yours. However, the original Storage Account is still having issues, and I am still trying to identify what the problem is.
That said, it is safe to say that SFTP events are supported.
3 weeks ago
Hi @akdav
I have the same issue as yours, and Azure support confirmed SFTP events are not supported: Storage account file event is not triggered for Azure Databricks on SFTP upload - Microsoft Q&A
Also, the subscription set up by Databricks does not subscribe to SFTP events, only to Blob ones, which are triggered when you use Data Studio / the Azure UI etc.
Does WinSCP upload to the storage account by SFTP, or does it use HTTPS? I can't work out how these events are triggered for you; such a headache to figure out.
3 weeks ago
Hi Dimitry.
I actually never got this working using file events. How I got it working was turning off file events for this external_location, and then still using a file trigger, which falls back to polling and can support up to 10K files.
This was the only way to get it working. Luckily, in my case, the 10K limit won't be an issue for these containers.
Hope this helps.
3 weeks ago
Better question: what do you mean by polling?
I can have only these triggers:
Continuous just runs the job non-stop, Scheduled runs on intervals, and the file trigger is the one that's not working.
So did you use some external workaround to poll?
3 weeks ago
Hi Dimitry,
No, the file trigger polling is managed by Databricks. You can use it with serverless.
3 weeks ago
Hi Dimitry,
You need to go to the external_location and turn off file events for that external_location.
Then you still select File Trigger. It will then evaluate the external_location.
It will give you a message that you can only track up to 10K files.
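For completeness, the same toggle can probably be flipped from code; a sketch with the Databricks Python SDK, assuming the external locations API exposes an enable_file_events flag as recent SDK versions appear to (the location name is a placeholder):

```python
# Sketch: turn off file events for the external location so the file
# arrival trigger falls back to Databricks-managed polling (up to 10K files).
# The location name is a placeholder, and enable_file_events availability
# depends on your SDK and workspace version.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.external_locations.update(
    name="sftp_reports_location",
    enable_file_events=False,
)
```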
3 weeks ago
Oh mate, you just made my day, night and the next few days altogether.
Yes, it works! It does!!!
3 weeks ago
Happy that worked for you!