
Deploying global parameters from lower to higher env in ADF

KVNARK
Honored Contributor II

How can we deploy global parameters from dev to higher environments in ADF? Could anyone throw some light on this?

I'm using Git in DEV and deploying to PROD using an Azure CI/CD pipeline.

1 ACCEPTED SOLUTION


Anonymous
Not applicable

@KVNARK: To deploy global parameters from dev to higher environments in Azure Data Factory (ADF), you can follow these steps:

  1. In your DEV environment, create the global parameters in ADF and save them.
  2. Commit and push the changes to your Git repository.
  3. Set up a build pipeline in Azure DevOps to build the ADF ARM template.
  4. Set up a release pipeline in Azure DevOps to deploy the ADF ARM template to higher environments.
  5. In the release pipeline, add a task that updates the global parameters on the target ADF instance using Azure PowerShell or the Azure CLI (see the sketch after this list).
  6. Use Azure Key Vault to store and manage the secrets for your global parameters.
  7. Grant the necessary permissions to access the Azure Key Vault secrets to the service principal or managed identity used by your ADF instance.
  8. In your ADF pipelines, reference the global parameters using the syntax @pipeline().globalParameters.<parameter_name>.

By following these steps, you can ensure that the global parameters are deployed along with the ADF ARM template and are available in higher environments.
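
For step 5, one way to push the global parameters to the target factory is an Azure PowerShell task in the release pipeline. The sketch below is only an outline: it assumes the Az.DataFactory module is available on the agent and that a globalParameters.json file exported from the DEV factory has been published as a build artifact; the file path, resource group, and factory names are placeholders, and the exact .NET type name can vary between module versions.

# Sketch only: update global parameters on the target factory (step 5).
param(
    [string] $globalParametersFilePath = "globalParameters.json",  # placeholder path to the exported parameters
    [string] $resourceGroupName        = "rg-adf-prod",             # placeholder resource group
    [string] $dataFactoryName          = "adf-prod"                 # placeholder factory name
)

Import-Module Az.DataFactory

# The factory object expects a typed dictionary of global parameter specifications.
# Note: the exact type name may differ between Az.DataFactory versions.
$newGlobalParameters = New-Object 'System.Collections.Generic.Dictionary[string,Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]'

# Read and parse the global parameters JSON exported from the DEV factory.
$globalParametersJson   = Get-Content -Raw $globalParametersFilePath
$globalParametersObject = [Newtonsoft.Json.Linq.JObject]::Parse($globalParametersJson)

foreach ($gp in $globalParametersObject.GetEnumerator()) {
    Write-Host "Adding global parameter: $($gp.Key)"
    $spec = $gp.Value.ToObject([Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification])
    $newGlobalParameters.Add($gp.Key, $spec)
}

# Fetch the target (e.g. PROD) factory, swap in the new parameters, and push the update.
$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
$dataFactory.GlobalParameters = $newGlobalParameters
Set-AzDataFactoryV2 -InputObject $dataFactory -Force

If your factory exposes the option to include global parameters directly in the generated ARM template, the ARM deployment task from step 4 carries them across and a script like the one above isn't needed.

For step 7, a hypothetical example of granting the target factory's managed identity read access to the vault's secrets (vault and factory names are again placeholders; if the vault uses Azure RBAC instead of access policies, assign the built-in Key Vault Secrets User role instead):

# Sketch only: allow the factory's managed identity to read secrets (step 7).
$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName "rg-adf-prod" -Name "adf-prod"
Set-AzKeyVaultAccessPolicy -VaultName "kv-adf-prod" `
    -ObjectId $dataFactory.Identity.PrincipalId `
    -PermissionsToSecrets Get,List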


