Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

File trigger using Azure file share in Unity Catalog

angel_ba
New Contributor II

Hello,

 

I have Unity Catalog enabled in my workspace. Files are manually copied by customers into an Azure file share (domain-joined account, wabs) on an ad hoc basis. I would like to add a file trigger to the job so that as soon as a file arrives in the file share, it gets copied to my container (abfss) and the job is triggered. However, I am only able to mount abfss storage, not the Azure file share. How can I resolve this?

2 REPLIES

Diego33
New Contributor II

Hi Kaniz. I have been trying to follow those steps for the past two days, with multiple Python workarounds to mount SMB on Databricks: step 3, "Mounting Azure File Share (WABS)", and step 5, "Copy the provided script". The Microsoft Azure script is provided in PowerShell or Bash, and it does not work in a Python notebook. I tried multiple workarounds without success and was not able to mount the Azure file share from a Databricks notebook. Can you help?

(Attachment: Screenshot_3.png)

adriennn
Contributor II

@Diego33 Kaniz is half-bot half-human, but unfortunately not gracing us with "sorry for the confusion" responses.

After a quick search, I thought there might be a possibility to use the web terminal and do a manual mount with the Bash script that Microsoft provides, but all I get is `mount: permission denied`. I have a use case involving File Share on my hands, so I thought I'd give it a try.

It's probably easier to work with the Azure Python SDK to access data from Azure Storage file shares.
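For reference, a minimal sketch of that SDK route, assuming the `azure-storage-file-share` package is installed on the cluster; every connection string, share name, and path here is a placeholder, not something from this thread:

```python
# Sketch, not a tested implementation: read files from an Azure file share
# with the azure-storage-file-share SDK. All names below are placeholders.

def list_share_files(conn_str: str, share_name: str) -> list:
    """List the names of non-directory entries at the root of a file share."""
    # Imported lazily so this module loads even where the SDK isn't installed.
    from azure.storage.fileshare import ShareClient
    share = ShareClient.from_connection_string(conn_str, share_name=share_name)
    return [e.name for e in share.list_directories_and_files()
            if not e.is_directory]

def read_share_file(conn_str: str, share_name: str, path: str) -> bytes:
    """Download one file's bytes from the share."""
    from azure.storage.fileshare import ShareClient
    share = ShareClient.from_connection_string(conn_str, share_name=share_name)
    return share.get_file_client(path).download_file().readall()

# Usage (placeholders):
# names = list_share_files("<connection-string>", "customer-share")
# data = read_share_file("<connection-string>", "customer-share", names[0])
```

From a Databricks notebook you could then write the downloaded bytes out to an abfss path, which sidesteps mounting entirely.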

And @angel_ba, there are vague references to a File Share trigger becoming available in Azure Functions, but it hasn't materialized, it isn't documented as a trigger, and it certainly isn't available in Databricks. That said, if I had to build a system that must pick up changed files in a File Share, I would use PowerShell to list the files (perhaps with a landing area, to avoid having too much to sift through), read the "LastModified" field from the metadata that listing returns, and then decide what to do. Or simply move the file from the File Share to blob storage, which solves all the other issues.
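The listing/LastModified check above doesn't have to be PowerShell. As a hedged sketch of the same idea using the `azure-storage-file-share` and `azure-storage-blob` Python SDKs (the connection string, share, container, and cutoff values are hypothetical, and job triggering is not shown):

```python
# Sketch: find files in a file share changed since a cutoff and copy them
# to blob storage. All names/credentials are placeholders.
from datetime import datetime, timezone

def modified_since(entries, cutoff):
    """Pure helper: from (name, last_modified) pairs, keep names newer than
    `cutoff`. In practice the pairs would be built from the
    FileProperties.last_modified values the SDK returns."""
    return [name for name, ts in entries if ts > cutoff]

def copy_new_files_to_blob(conn_str, share_name, container, cutoff):
    """Copy files changed since `cutoff` from the file share to blob storage."""
    # Third-party SDKs, imported lazily so the module loads without them.
    from azure.storage.fileshare import ShareClient
    from azure.storage.blob import BlobServiceClient
    share = ShareClient.from_connection_string(conn_str, share_name=share_name)
    blob_svc = BlobServiceClient.from_connection_string(conn_str)
    for entry in share.list_directories_and_files():
        if entry.is_directory:
            continue
        file_client = share.get_file_client(entry.name)
        props = file_client.get_file_properties()
        if props.last_modified > cutoff:
            data = file_client.download_file().readall()
            blob_svc.get_blob_client(container, entry.name).upload_blob(
                data, overwrite=True)
```

Once the files land in blob storage (abfss), a regular Databricks file arrival trigger can take over.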
