Yes, you're correct: the downstream tasks would still be triggered. You could try cancelling the run using the Jobs API, but that is probably not what you are looking for, since you want the job run to end in a success state. Maybe propagating the check using ...
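For completeness, a cancel via the Jobs API 2.1 would look roughly like the sketch below. The workspace URL, token, and run ID are placeholders; only the endpoint path and payload shape come from the API.

```python
# Hedged sketch: cancel a Databricks job run via the Jobs API 2.1
# (POST /api/2.1/jobs/runs/cancel). Host, token, and run_id below are
# made-up placeholders -- substitute your own workspace values.
import json
from urllib import request

def build_cancel_request(host: str, token: str, run_id: int) -> request.Request:
    """Build the POST request for the runs/cancel endpoint."""
    url = f"{host}/api/2.1/jobs/runs/cancel"
    body = json.dumps({"run_id": run_id}).encode()
    return request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_cancel_request("https://adb-1234.5.azuredatabricks.net", "dapiXXXX", 42)
# request.urlopen(req)  # uncomment to actually send the cancel call
```

Note that a cancelled run ends in a cancelled state, not a success state, which is why the notebook-exit approach below is usually the better fit here.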
You can use dbutils.notebook.exit("returnValue"), as explained here, to stop execution of a notebook gracefully:

if condition:
    pass
else:
    dbutils.notebook.exit("Aborting as condition not met. Further tasks will be skipped")
If you are on Azure, this could be due to an outage in the Azure West Europe region: https://status.azuredatabricks.net/pages/incident/5d49ec10226b9e13cb6a422e/64a52da53e85fa053c159480 Looks like a number of fiber links between datacenters have become unava...
I think if you copy the underlying files of the source table to a new location, along with the _delta_log/ directory, you effectively have the full history. Given that you have an external location set up for Unity Catalog, you can then create a table on this location. An...
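As a minimal local illustration of the idea (local temp directories stand in for cloud storage; a real copy would use dbutils.fs.cp, azcopy, or similar), the key point is that the data files and the _delta_log/ directory travel together:

```python
# Sketch: copying a Delta table directory including its transaction log.
# All paths and file names here are fabricated for illustration only.
import pathlib
import shutil
import tempfile

root = pathlib.Path(tempfile.mkdtemp())

# Simulate a source Delta table: data files plus _delta_log/
src = root / "source_table"
(src / "_delta_log").mkdir(parents=True)
(src / "_delta_log" / "00000000000000000000.json").write_text("{}")
(src / "part-00000.parquet").write_bytes(b"\x00")

# Copy everything in one go -- history (the log) comes along with the data
dst = root / "copied_table"
shutil.copytree(src, dst)

print(sorted(p.name for p in dst.rglob("*") if p.is_file()))
```

Once the files are in place under the external location, something like `CREATE TABLE my_catalog.my_schema.copied_table USING DELTA LOCATION '<copied path>'` (catalog and schema names are placeholders) should register the table with its history intact.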