Best practices for working with external locations where many files arrive constantly
I have an Azure Function that receives files (not volumes) and dumps them to cloud storage. Approximately one to five files arrive per second. I want to create a partitioned table in Databricks to work with this data. How should I do this? E.g.: register the cont...
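One common pattern for this kind of high-frequency file arrival is Databricks Auto Loader (the `cloudFiles` source), which incrementally discovers new files in an external location and streams them into a partitioned Delta table. The sketch below is not from the original post; the storage paths, file format, partition column, and target table name are hypothetical placeholders, and it assumes a Databricks notebook where `spark` is already defined.

```python
# Minimal sketch: Auto Loader ingesting new files from an external location
# into a partitioned Delta table. Paths and names below are hypothetical.
from pyspark.sql import functions as F

source_path = "abfss://landing@mystorageaccount.dfs.core.windows.net/incoming/"        # hypothetical
checkpoint_path = "abfss://landing@mystorageaccount.dfs.core.windows.net/_chk/incoming/"  # hypothetical

stream = (
    spark.readStream
        .format("cloudFiles")                           # Auto Loader source
        .option("cloudFiles.format", "json")            # adjust to the actual file format
        .option("cloudFiles.schemaLocation", checkpoint_path)
        .load(source_path)
        # Derive a partition column from the file's modification time (hypothetical choice)
        .withColumn("ingest_date", F.to_date(F.col("_metadata.file_modification_time")))
)

(
    stream.writeStream
        .option("checkpointLocation", checkpoint_path)
        .partitionBy("ingest_date")                     # hypothetical partition column
        .trigger(availableNow=True)                     # batch-style run; use processingTime for continuous
        .toTable("main.bronze.incoming_files")          # hypothetical Unity Catalog table
)
```

With files arriving every second, directory listing can become a bottleneck; Auto Loader's file-notification mode (`cloudFiles.useNotifications`) is often suggested for such rates, though whether it fits depends on the storage account setup.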