
Connect Azure File Share with Databricks

Upendra_Dwivedi
Contributor

Hi All,

I am working on a task where I need to access an Azure File Share from Databricks and move files from there to a storage account blob container.
One solution I found is to use the azure-file-share Python package, which needs a SAS token, but I don't know whether that is a recommended and reliable approach for production.
I am wondering whether it is possible to use a service principal instead of a SAS token for authentication and read the data, or to mount the file share the way we mount a blob container.

Mounting it or using a service principal would be best, I guess 🙂

Please suggest a good approach for ingesting data from the file share into the blob container. For reference, a rough sketch of the SDK-based approach I am considering is below.
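This is only a minimal sketch, not something I have validated for production. It assumes the azure-storage-file-share and azure-storage-blob SDKs plus azure-identity; all account names, share/container names, secrets, and the SAS token are placeholders, and the flat share layout is an assumption.

# Minimal sketch: copy files from an Azure File Share (SAS auth) into a blob
# container (service principal auth). All names below are placeholders.
from azure.identity import ClientSecretCredential
from azure.storage.fileshare import ShareClient
from azure.storage.blob import BlobServiceClient

# File share side: as far as I know, data access here still needs a SAS token
# (or the account key).
share = ShareClient(
    account_url="https://<files-account>.file.core.windows.net",
    share_name="<share-name>",
    credential="<sas-token>",  # e.g. pulled from a Databricks secret scope
)

# Blob side: a service principal works here via azure-identity.
blob_service = BlobServiceClient(
    account_url="https://<blob-account>.blob.core.windows.net",
    credential=ClientSecretCredential(
        tenant_id="<tenant-id>",
        client_id="<client-id>",
        client_secret="<client-secret>",  # e.g. dbutils.secrets.get(...)
    ),
)
container = blob_service.get_container_client("<container-name>")

# Walk the share root (assuming a flat layout, no subdirectories) and stream
# each file into the blob container.
for item in share.list_directories_and_files():
    if item["is_directory"]:
        continue
    data = share.get_file_client(item["name"]).download_file().readall()
    container.upload_blob(name=item["name"], data=data, overwrite=True)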

1 REPLY

Omerabbasi
New Contributor II

I think you are on the right track. To get a bit more granular: once the Azure File Share is mounted, use Spark to read the data from the source path and write it to the blob container.
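Something along these lines, as a minimal sketch: the /mnt/fileshare mount point and the abfss destination are placeholders, not values from this thread, and they assume the share contents and the target container are already reachable from the cluster.

# Sketch of the Spark copy step, assuming the file share contents are already
# reachable at /mnt/fileshare and the destination blob container is reachable
# via abfss. Both paths are placeholders.
src_path = "/mnt/fileshare/incoming/"
dst_path = "abfss://landing@<storage-account>.dfs.core.windows.net/incoming/"

# Read the source files with Spark and write them out to the blob container.
df = spark.read.option("header", "true").csv(src_path)
df.write.mode("overwrite").parquet(dst_path)

# For a byte-for-byte copy without parsing the files, dbutils.fs.cp also works:
# dbutils.fs.cp(src_path, dst_path, recurse=True)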
