I am using the code below to read from a source location in an ADLS Gen2 Azure storage container:

```python
core_df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("multiLine", "false")
    .option(...
```
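For context, a fuller version of this Auto Loader pattern typically also sets a schema location on the reader and a checkpoint location on the writer, which is what makes the load incremental. This is only a sketch with placeholder paths and table names (the `abfss://...` URIs and `catalog.schema.core` are assumptions, not my actual locations):

```python
# Minimal Auto Loader sketch; all paths and the target table are placeholders.
core_df = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("multiLine", "false")
    # Auto Loader persists the inferred schema here across runs.
    .option("cloudFiles.schemaLocation",
            "abfss://container@account.dfs.core.windows.net/_schemas/core")
    .load("abfss://container@account.dfs.core.windows.net/source/core")
)

(
    core_df.writeStream
    # The checkpoint tracks which files were already ingested,
    # so only new files are processed on the next run.
    .option("checkpointLocation",
            "abfss://container@account.dfs.core.windows.net/_checkpoints/core")
    .trigger(availableNow=True)      # process all pending files, then stop
    .toTable("catalog.schema.core")  # Unity Catalog managed Delta table
)
```

The checkpoint folder is created by the writer on first run, so if no write stream has run yet with that path, the folder will not exist in the container.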
Hello,

I have created a job with no timeout-seconds provided, but I am getting the error: Timed out within 20 minutes. I am running the commands below using a Bash@3 task in an ADO pipeline YAML file. The code for the same is given below:

```yaml
task: Bash@3
timeoutIn...
```
Hey @szymon_dybczak,

I checked my storage container and I am not seeing any folder at the given checkpoint path. Do I need to set up anything differently for that? The concern I have is that the incremental load was working well when usi...
Hello @szymon_dybczak,

I have granted the required permissions to the service principal, and the queue is generated in the storage account. We are actually not using writeStream functionality; we are using Unity Catalog and storing the data via Delta...
Hey Fatma,

Thanks for your response. I found the solution: when triggering run-now from the Databricks CLI, the CLI itself has a default timeout of 20 minutes. I have changed that and now it is working as expected.

Thanks,
Ibrahim
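For anyone hitting the same error: in the newer Databricks CLI, commands that wait for a run to finish accept a `--timeout` duration flag (the default wait is 20 minutes). A rough example of the changed command, where the job ID and duration are placeholders:

```shell
# Trigger the job and wait up to 2 hours instead of the CLI's default 20 minutes.
# 1234 and 2h are placeholder values for illustration.
databricks jobs run-now 1234 --timeout 2h
```

Note this is the CLI's client-side wait, separate from the job's own timeout_seconds setting, which is why the job appeared to time out even with no job timeout configured.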