Hi All,
I am new to Databricks and need some help understanding how to handle our requirement.
Our requirement:
a. We have a zip file in Azure Blob Storage. We bring that file to DBFS, unzip it, and execute our transformations in three steps, with each step in a different notebook, so we have used three notebooks in total (a rough sketch of the copy/unzip step is just after this list).
b. The output of the 1st notebook becomes the input data for the next notebook; we have done this hand-off through tables created in a database on DBFS (see the second sketch below the list).
c. The same goes for the 3rd notebook, which uses the output produced by the 2nd notebook.
d. When we run these steps on an interactive cluster, everything works fine.
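For context, the copy/unzip in step (a) looks roughly like this in the 1st notebook (the paths and file names are placeholders for our real ones, I am assuming the zip contains CSVs, and `spark` and `dbutils` are the objects Databricks predefines in notebooks):

```python
import zipfile

# Copy the zip from DBFS to the driver's local disk, because the
# zipfile module cannot open dbfs:/ paths directly.
dbutils.fs.cp("dbfs:/tmp/input.zip", "file:/tmp/input.zip")

# Unzip on the driver's local disk.
with zipfile.ZipFile("/tmp/input.zip", "r") as zf:
    zf.extractall("/tmp/unzipped")

# Copy the extracted files back to DBFS so Spark can read them.
dbutils.fs.cp("file:/tmp/unzipped", "dbfs:/tmp/unzipped", recurse=True)

# Read the extracted data for the step-1 transformations.
step1_input = spark.read.csv("dbfs:/tmp/unzipped", header=True)
```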
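And the hand-off in steps (b) and (c) goes through tables, roughly like this (the database and table names here are placeholders):

```python
# End of the 1st notebook: persist the result as a table so the
# 2nd notebook can read it.
spark.sql("CREATE DATABASE IF NOT EXISTS pipeline_db")
step1_output.write.mode("overwrite").saveAsTable("pipeline_db.step1_output")

# Start of the 2nd notebook: pick up where the 1st notebook left off.
step1_output = spark.table("pipeline_db.step1_output")
```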
Question:
1. In our organization's production setup right now, a separate job cluster is spun up for every notebook. Can multiple job clusters access the same DBFS, or do we need to mount some external storage to handle this scenario? (A rough sketch of the mount I have in mind is below.)
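If mounting is the way to go, I am guessing it would look something like this, run once so that every cluster resolves the same /mnt path (the storage account, container, and secret scope/key names are placeholders for our real ones):

```python
# Mount an Azure Blob Storage container so all clusters (interactive
# and job) see the same dbfs:/mnt/shared path. The storage key is read
# from a Databricks secret scope rather than hard-coded.
dbutils.fs.mount(
    source="wasbs://mycontainer@myaccount.blob.core.windows.net",
    mount_point="/mnt/shared",
    extra_configs={
        "fs.azure.account.key.myaccount.blob.core.windows.net":
            dbutils.secrets.get(scope="my-scope", key="storage-key")
    }
)
```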
Regards,
Praveen