Hi @ElaPG, The limit of 100 concurrent pipeline updates in an Azure Databricks workspace refers to the maximum number of pipeline updates that can be in progress at the same time within a single workspace.
Let's break it down:
Pipeline Logic Changes: Any alteration to the pipeline configuration, such as modifying the workflow structure, adjusting data transformations, or changing dependencies, counts as an update. For instance, adding a new stage to your ETL pipeline, or adjusting the logic within an existing stage, contributes to the concurrent update count.
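To make that concrete, here is a minimal sketch of what such a configuration change looks like programmatically, using the Pipelines REST API edit endpoint (`PUT /api/2.0/pipelines/{pipeline_id}`). The workspace URL, token, pipeline ID, pipeline name, and notebook paths are all placeholders you would replace with your own values:

```python
# Sketch: edit a pipeline's configuration via the Databricks Pipelines
# REST API (PUT /api/2.0/pipelines/{pipeline_id}). Note that the edit
# endpoint expects the full pipeline spec, including the pipeline id.
import requests

WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"   # placeholder
PIPELINE_ID = "<pipeline-id>"       # placeholder

# Hypothetical edit: add one more notebook (a new stage) to the pipeline.
new_spec = {
    "id": PIPELINE_ID,
    "name": "my-etl-pipeline",  # hypothetical pipeline name
    "libraries": [
        {"notebook": {"path": "/Repos/etl/bronze_ingest"}},      # existing stage
        {"notebook": {"path": "/Repos/etl/silver_transform"}},   # newly added stage
    ],
    "continuous": False,
}

resp = requests.put(
    f"{WORKSPACE_URL}/api/2.0/pipelines/{PIPELINE_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=new_spec,
)
resp.raise_for_status()
```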
Pipeline Runs: Each execution of a pipeline also counts as an update. Whether it's a scheduled run, triggered manually, or part of an automated process, each execution consumes one of the available concurrent update slots while it is in progress.
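This is roughly what triggering one such run looks like through the API, using `POST /api/2.0/pipelines/{pipeline_id}/updates` to start an update (again with placeholder credentials):

```python
# Sketch: start one pipeline update (one run). Each successful call
# consumes one of the workspace's concurrent-update slots while it runs.
import requests

WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"   # placeholder
PIPELINE_ID = "<pipeline-id>"       # placeholder

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/pipelines/{PIPELINE_ID}/updates",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"full_refresh": False},  # incremental run; True recomputes all tables
)
resp.raise_for_status()
print("Started update:", resp.json()["update_id"])
```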
In summary, both changes to pipeline logic and pipeline runs count toward the limit of 100 concurrent updates. Keep this in mind while managing your Azure Databricks workspace to ensure efficient utilization of resources. 🙂
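If you want to keep an eye on how close you are to the limit, one approach is to poll the list-pipelines endpoint (`GET /api/2.0/pipelines`) and count how many pipelines are currently in a RUNNING state. A rough sketch, assuming the `statuses`/`state` field names and `max_results`/`page_token` pagination of the 2.0 Pipelines API:

```python
# Sketch: approximate the number of in-flight updates by counting
# pipelines whose state is RUNNING, paging through the full list.
import requests

WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"   # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

running = 0
page_token = None
while True:
    params = {"max_results": 100}
    if page_token:
        params["page_token"] = page_token
    resp = requests.get(
        f"{WORKSPACE_URL}/api/2.0/pipelines", headers=HEADERS, params=params
    )
    resp.raise_for_status()
    body = resp.json()
    running += sum(1 for p in body.get("statuses", []) if p.get("state") == "RUNNING")
    page_token = body.get("next_page_token")
    if not page_token:
        break

print(f"Pipelines currently running updates: {running} / 100")
```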