@vivi007
Yes, you can create dependencies between jobs in different DABs (Databricks Asset Bundles), but this requires a different approach than task dependencies within a single DAB.
Since DABs are designed to be independently deployable units, direct dependencies between them aren't built into the DAB structure itself. However, you can implement cross-DAB job dependencies using a few methods:
Databricks Jobs (Workflows) API: Programmatically check the completion status of a job in one DAB before triggering the dependent job in another DAB.
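A minimal sketch of that pattern using the Databricks Python SDK (databricks-sdk); the job IDs below are hypothetical placeholders for the jobs each bundle deploys:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import RunResultState

w = WorkspaceClient()  # auth picked up from env vars or ~/.databrickscfg

UPSTREAM_JOB_ID = 111    # job deployed by the first DAB (placeholder)
DOWNSTREAM_JOB_ID = 222  # job deployed by the second DAB (placeholder)

# run_now(...).result() blocks until the run reaches a terminal state
upstream = w.jobs.run_now(job_id=UPSTREAM_JOB_ID).result()

if upstream.state.result_state == RunResultState.SUCCESS:
    w.jobs.run_now(job_id=DOWNSTREAM_JOB_ID)  # fire the dependent job
else:
    raise RuntimeError(f"Upstream job failed: {upstream.state.state_message}")
```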
External orchestration tools: Tools like Apache Airflow can orchestrate workflows across multiple DABs by monitoring job completion and triggering dependent jobs.
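For example, a sketch of an Airflow DAG, assuming the apache-airflow-providers-databricks package and a Databricks connection named databricks_default (both assumptions here; the job IDs are placeholders):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="cross_bundle_dependency",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # The operator polls the run until it finishes, so the >> edge below
    # means "start bundle B's job only after bundle A's job succeeds".
    bundle_a_job = DatabricksRunNowOperator(
        task_id="run_bundle_a_job",
        databricks_conn_id="databricks_default",
        job_id=111,  # job from the first DAB (placeholder)
    )
    bundle_b_job = DatabricksRunNowOperator(
        task_id="run_bundle_b_job",
        databricks_conn_id="databricks_default",
        job_id=222,  # job from the second DAB (placeholder)
    )
    bundle_a_job >> bundle_b_job
```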
Event-based triggers: Set up the first DAB's job to publish an event (e.g., to a webhook or message queue) upon completion, and configure the second DAB's job to start when it receives this event.
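On the publishing side, this can be as small as a final task in the first job that POSTs a completion event; the endpoint here is a hypothetical placeholder, and the receiver would call the Jobs API to start the second bundle's job:

```python
import requests

WEBHOOK_URL = "https://example.com/hooks/bundle-a-finished"  # placeholder

def notify_completion(job_name: str, run_id: int) -> None:
    """Last task of the first bundle's job: publish a completion event."""
    payload = {"job": job_name, "run_id": run_id, "status": "SUCCESS"}
    resp = requests.post(WEBHOOK_URL, json=payload, timeout=10)
    resp.raise_for_status()  # fail the task if the event can't be delivered

notify_completion("bundle_a_daily_load", run_id=12345)
```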
Delta Live Tables (DLT) pipeline dependencies: If you're using DLT, a pipeline in one DAB can depend on a pipeline in another DAB by reading the tables that pipeline publishes.
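A minimal sketch, assuming both pipelines publish to Unity Catalog and the upstream pipeline (from the other bundle) maintains a table named main.bronze.orders_raw (a placeholder):

```python
import dlt
from pyspark.sql import functions as F

# Runs inside a DLT pipeline, where `spark` is provided by the runtime.
@dlt.table(comment="Built on a table published by the other bundle's pipeline")
def orders_clean():
    # Reading the upstream pipeline's output table is what creates the
    # cross-pipeline (and thus cross-bundle) data dependency.
    return (
        spark.read.table("main.bronze.orders_raw")
        .where(F.col("order_id").isNotNull())
    )
```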
Scheduled dependencies: Configure the dependent job with a schedule that starts after the prerequisite job's expected completion time, ideally with a guard check that the prerequisite actually succeeded.
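One way to implement that guard is a first task in the dependent job that verifies the prerequisite's latest run succeeded recently, again via databricks-sdk; the job ID and freshness window are placeholders:

```python
from datetime import datetime, timedelta, timezone

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import RunResultState

w = WorkspaceClient()
PREREQ_JOB_ID = 111           # job from the other bundle (placeholder)
MAX_AGE = timedelta(hours=6)  # how fresh the prerequisite run must be

# Most recent completed run of the prerequisite job, if any
latest = next(w.jobs.list_runs(job_id=PREREQ_JOB_ID, completed_only=True, limit=1), None)

if latest is None or latest.state.result_state != RunResultState.SUCCESS:
    raise RuntimeError("Prerequisite job has no recent successful run")

finished = datetime.fromtimestamp(latest.end_time / 1000, tz=timezone.utc)
if datetime.now(timezone.utc) - finished > MAX_AGE:
    raise RuntimeError(f"Prerequisite run finished more than {MAX_AGE} ago")
```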
Remember that when implementing cross-DAB dependencies, you'll need to handle error scenarios and retries appropriately, as well as ensure proper access controls are in place between the different asset bundles.