How to parameterize the key of a Spark config in the job cluster linked service from ADF

KVNARK
Honored Contributor II

How can we parameterize the key of the spark-config in the job cluster linked service from Azure Data Factory? We can parameterize the values, but is there any way to parameterize the key as well, so that when deploying to further environments it takes the PROD/QA values? Thanks much in advance!

1 ACCEPTED SOLUTION

Accepted Solutions

daniel_sahal
Esteemed Contributor

@KVNARK

You can use Databricks Secrets: create a secret scope backed by Azure Key Vault (https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes) and then reference a secret in the Spark configuration (https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secrets).
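To make the suggestion concrete, here is a minimal sketch of what such a secret reference can look like in the job cluster's Spark config. The scope name (`my-scope`), key name (`storage-account-key`), and storage account (`myaccount`) are placeholder names for illustration, not taken from the thread:

```
# Spark config entry on the job cluster.
# The {{secrets/<scope>/<key>}} reference is resolved by Databricks at cluster start,
# so the plaintext value never appears in the cluster definition.
spark.hadoop.fs.azure.account.key.myaccount.dfs.core.windows.net {{secrets/my-scope/storage-account-key}}
```

Because the scope is backed by Azure Key Vault, you can point each environment's scope at its own Key Vault; the same key name then resolves to the PROD or QA value without changing the job definition in ADF.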


4 REPLIES


KVNARK
Honored Contributor II

Thanks a lot, this helps @Daniel Sahal!

KVNARK
Honored Contributor II

@Daniel Sahal - whatever links you share for any requirement are always very helpful. Kudos!

daniel_sahal
Esteemed Contributor

@KVNARK Thanks! I'm always happy to help 🙂
