I wanted to set up Auto Loader to process files from Azure Data Lake (Blob) automatically whenever new files arrive. For this to work, I wanted to know whether Auto Loader requires the cluster to be on all the time.
How do you connect to an Azure Databricks instance from another Databricks instance? I needed to access (database) views created in one Databricks instance from a PySpark notebook running in another Databricks instance. I'd appreciate it if anyone has a sample...
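One common pattern for this (not confirmed anywhere in this thread, so treat it as a sketch) is to read the remote view over JDBC with the Databricks JDBC driver, using the remote workspace's connection details and a personal access token. The hostname, HTTP path, token, and view name below are all placeholders you would replace with your own values.

```python
# Placeholders -- copy these from the *remote* workspace's cluster or SQL
# warehouse connection details (Compute -> JDBC/ODBC tab), plus a personal
# access token generated in that workspace.
jdbc_url = (
    "jdbc:databricks://adb-1234567890123456.7.azuredatabricks.net:443/default;"
    "transportMode=http;ssl=1;"
    "httpPath=sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh;"
    "AuthMech=3;UID=token;PWD=<personal-access-token>"
)

# Read a view defined in the remote workspace into this notebook's session.
# "my_database.my_view" is a hypothetical name standing in for your view.
remote_view = (spark.read
               .format("jdbc")
               .option("url", jdbc_url)
               .option("dbtable", "my_database.my_view")
               .option("driver", "com.databricks.client.jdbc.Driver")
               .load())

remote_view.show()
```

The JDBC driver JAR must be attached to the cluster (recent Databricks runtimes ship it). For production, store the token in a secret scope rather than inlining it in the URL.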
Thanks @Kaniz Fatma for the response. Unfortunately I don't have a set frequency for the arrival of files; it is very ad hoc. Let me ask you this question: is it possible for Event Grid to trigger a Databricks job?
@Hubert Dudek, thank you for your response. My question was: does the cluster have to be on all the time to take advantage of Auto Loader? What happens if a file arrives in blob storage while the cluster is down? Does it automatically start the ...
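For context, a minimal sketch of the pattern that avoids an always-on cluster: run the Auto Loader stream with a one-shot trigger from a scheduled (or externally triggered) job. The checkpoint records which files have been processed, so a run started after downtime picks up anything that arrived while no cluster was running. The paths, file format, and target table below are placeholder assumptions, not values from this thread.

```python
# Hypothetical locations -- substitute your own container and checkpoint path.
input_path = "abfss://landing@mystorageacct.dfs.core.windows.net/incoming/"
checkpoint_path = "abfss://landing@mystorageacct.dfs.core.windows.net/_checkpoints/incoming/"

# Auto Loader source: discovers files incrementally, tracked in the checkpoint.
df = (spark.readStream
      .format("cloudFiles")
      .option("cloudFiles.format", "json")
      .load(input_path))

# availableNow processes everything that arrived since the last run, then
# stops, so this can run on a job cluster that spins up per run instead of
# an interactive cluster that stays on 24/7.
(df.writeStream
   .format("delta")
   .option("checkpointLocation", checkpoint_path)
   .trigger(availableNow=True)
   .toTable("bronze.incoming"))
```

`trigger(availableNow=True)` requires a reasonably recent runtime (Spark 3.3 / DBR 10.4+); on older runtimes `trigger(once=True)` plays a similar role.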