Resolved! Azure Data Factory: allocate resources per Notebook
I'm using Azure Data Factory to create a pipeline of Databricks notebooks, something like this: [Notebook 1 - data pre-processing] -> [Notebook 2 - model training] -> [Notebook 3 - performance evaluation]. Can I write some config file that would allow...
- 1105 Views
- 2 replies
- 4 kudos
Latest Reply
I understand that, in your case, auto-scaling will take too much time. The simplest option is to use a different cluster for each notebook (and make sure the previous cluster is terminated immediately). Another option is to use the REST API 2.0/cluster...
- 4 kudos
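The reply's REST API reference is truncated, but it presumably points at the Databricks Clusters API (the `2.0/clusters/...` endpoints). Below is a minimal sketch of how that could be called between pipeline stages to adjust or release resources per notebook; the workspace URL, token, and cluster ID are placeholders, not values from the thread.

```python
# Hedged sketch: adjust cluster resources between notebooks using the
# Databricks Clusters REST API 2.0. Host, token, and cluster ID are
# placeholder assumptions.
import requests

DATABRICKS_HOST = "https://<your-workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"                                  # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def resize_cluster(cluster_id: str, num_workers: int) -> None:
    """Request a new worker count for a running cluster (2.0/clusters/resize)."""
    resp = requests.post(
        f"{DATABRICKS_HOST}/api/2.0/clusters/resize",
        headers=HEADERS,
        json={"cluster_id": cluster_id, "num_workers": num_workers},
    )
    resp.raise_for_status()


def terminate_cluster(cluster_id: str) -> None:
    """Terminate a cluster once its notebook has finished (2.0/clusters/delete)."""
    resp = requests.post(
        f"{DATABRICKS_HOST}/api/2.0/clusters/delete",
        headers=HEADERS,
        json={"cluster_id": cluster_id},
    )
    resp.raise_for_status()


if __name__ == "__main__":
    # For example, shrink the training cluster before the lighter evaluation
    # notebook, or terminate the pre-processing cluster as soon as Notebook 1
    # completes, so the next notebook gets a right-sized cluster immediately.
    resize_cluster("0123-456789-abcdefgh", num_workers=2)  # placeholder cluster ID
```

In practice this kind of call could be wrapped in a small notebook or Web Activity that runs between the ADF notebook activities, mirroring the reply's point that separate, explicitly terminated clusters avoid waiting on auto-scaling.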