We have a use case where Job C should start only after both Job A and Job B have successfully completed.
In Airflow, we achieve this using an ExternalTaskSensor to set dependencies across different DAGs.
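For context, here is a minimal sketch of what our current Airflow setup looks like (DAG and task ids here are hypothetical placeholders, and the sensor waits for the entire upstream DAG run):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.sensors.external_task import ExternalTaskSensor

with DAG(
    dag_id="job_c",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Block until the whole "job_a" DAG run for the same logical date succeeds
    wait_for_a = ExternalTaskSensor(
        task_id="wait_for_job_a",
        external_dag_id="job_a",   # hypothetical upstream DAG
        external_task_id=None,     # None = wait for the entire DAG run
    )
    # Same for "job_b"
    wait_for_b = ExternalTaskSensor(
        task_id="wait_for_job_b",
        external_dag_id="job_b",   # hypothetical upstream DAG
        external_task_id=None,
    )
    run_c = EmptyOperator(task_id="run_job_c")

    # Job C's work runs only after both sensors succeed
    [wait_for_a, wait_for_b] >> run_c
```

We'd like to replicate this fan-in pattern (two upstream jobs gating one downstream job) in Databricks.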
Is there a way to configure something similar in Databricks, so that Job C is triggered automatically only after both Job A and Job B have finished successfully?
I looked through the documentation but couldn't find anything specific for this scenario. Any guidance or best practices would be appreciated!