I am creating a cluster using a REST API call, but every time it creates an all-purpose cluster. Is there a way to create a job cluster and run a notebook using Python code?
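A minimal sketch of one way to do this: the Jobs API 2.1 `runs/submit` endpoint accepts a `new_cluster` spec inside a task, which makes Databricks spin up an ephemeral job cluster for that run instead of an all-purpose cluster. The workspace URL, token, notebook path, runtime version, and node type below are placeholders/assumptions you would replace with your own values.

```python
import json

def build_one_time_run_payload(notebook_path):
    # Jobs API 2.1 "runs submit" payload. Supplying new_cluster (instead of
    # existing_cluster_id) tells Databricks to create a job cluster that is
    # terminated automatically when the run finishes.
    return {
        "run_name": "notebook-on-job-cluster",
        "tasks": [
            {
                "task_key": "run_notebook",
                "notebook_task": {"notebook_path": notebook_path},
                "new_cluster": {
                    # Assumptions: pick a runtime and node type valid
                    # for your cloud and workspace.
                    "spark_version": "13.3.x-scala2.12",
                    "node_type_id": "Standard_DS3_v2",
                    "num_workers": 2,
                },
            }
        ],
    }

if __name__ == "__main__":
    import urllib.request

    host = "https://<workspace-url>"    # placeholder
    token = "<personal-access-token>"   # placeholder
    payload = build_one_time_run_payload("/Users/me/my_notebook")
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/runs/submit",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    # urllib.request.urlopen(req)  # uncomment once host/token are real
```

The key point is that `new_cluster` and `existing_cluster_id` are mutually exclusive per task: as long as you send `new_cluster`, no all-purpose cluster is involved.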
If I use `databricks fs cp`, it does not overwrite an existing file; it just skips copying it. Any suggestion on how to overwrite the file using the Databricks CLI?
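The Databricks CLI's `fs cp` command supports an `--overwrite` flag; without it, files that already exist at the destination are skipped. A small hedged sketch that assembles the invocation (the `build_fs_cp_cmd` helper is hypothetical, just for illustration; running it requires a configured Databricks CLI):

```python
import subprocess

def build_fs_cp_cmd(src, dst, overwrite=True, recursive=False):
    # Assemble a `databricks fs cp` command line. Passing --overwrite
    # replaces existing files at the destination instead of skipping them.
    cmd = ["databricks", "fs", "cp"]
    if overwrite:
        cmd.append("--overwrite")
    if recursive:
        cmd.append("--recursive")
    cmd += [src, dst]
    return cmd

if __name__ == "__main__":
    cmd = build_fs_cp_cmd("./model.json", "dbfs:/models/model.json")
    print(" ".join(cmd))
    # subprocess.run(cmd, check=True)  # uncomment to actually copy
```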
I have a requirement to delete older data from a folder, where the folder's data may be partitioned or unpartitioned. What approach should I follow to build this utility?
Hi @Vidula Khanna @Kaniz Fatma, this does not answer my real question. We mount an ADLS path in Databricks where we read and write data. Suppose I have a folder partitioned on audit, like below. I want to build a utility to delete the...
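One possible approach for the partitioned case, sketched with local `pathlib`/`shutil` so it is self-contained: parse the date out of each Hive-style partition folder name (assumed here to look like `audit=YYYY-MM-DD`, an assumption since the thread's example is truncated) and remove folders older than a cutoff. On Databricks you would swap the local filesystem calls for `dbutils.fs.ls` / `dbutils.fs.rm` against the mounted ADLS path; for an unpartitioned folder you would instead fall back to file modification times.

```python
import shutil
from datetime import date, datetime
from pathlib import Path

def delete_old_partitions(root, column, cutoff):
    """Delete partition subfolders (e.g. audit=2020-01-01) older than cutoff.

    Assumes Hive-style partition directories named <column>=<YYYY-MM-DD>.
    Returns the names of the folders it removed.
    """
    removed = []
    prefix = f"{column}="
    for child in Path(root).iterdir():
        if not (child.is_dir() and child.name.startswith(prefix)):
            continue  # ignore files and non-partition folders
        try:
            part_date = datetime.strptime(
                child.name[len(prefix):], "%Y-%m-%d"
            ).date()
        except ValueError:
            continue  # skip folders whose suffix is not a date
        if part_date < cutoff:
            shutil.rmtree(child)
            removed.append(child.name)
    return removed
```

Parsing the partition value from the folder name (rather than listing file timestamps) keeps the utility cheap on large partitioned folders, since it never has to descend into the partitions it keeps.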