Data Engineering

How to parameterize key of spark config in the job clusterlinked service from ADF

KVNARK
Honored Contributor II

How can we parameterize the key of the Spark config in the job cluster linked service from Azure Data Factory? We can parameterize the values, but is there any way to parameterize the key as well, so that on deployment to further environments it picks up the PROD/QA values? Thanks much in advance!

1 ACCEPTED SOLUTION

Accepted Solutions

daniel_sahal
Esteemed Contributor

@KVNARK

You can use Databricks Secrets: create a secret scope backed by Azure Key Vault (https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes) and then reference a secret in the Spark configuration (https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secrets).
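
To make the suggestion concrete: a secret in an AKV-backed scope can be referenced from a cluster's Spark config with the `{{secrets/<scope-name>/<secret-name>}}` syntax, so the Spark config key stays fixed while only the scope contents differ between QA and PROD. A minimal sketch of the ADF Azure Databricks linked service fragment is below; the scope name `akv-scope`, secret name `storage-account-key`, workspace URL, and storage account name are all hypothetical placeholders:

```json
{
    "name": "AzureDatabricksLinkedService",
    "properties": {
        "type": "AzureDatabricks",
        "typeProperties": {
            "domain": "https://adb-1234567890123456.7.azuredatabricks.net",
            "newClusterSparkConf": {
                "spark.hadoop.fs.azure.account.key.mystorageacct.dfs.core.windows.net": "{{secrets/akv-scope/storage-account-key}}"
            }
        }
    }
}
```

At cluster start, Databricks resolves the `{{secrets/...}}` reference against the named scope, so pointing each environment's scope at its own Key Vault lets the same linked service definition yield QA or PROD values without parameterizing the key itself.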


4 REPLIES


KVNARK
Honored Contributor II

Thanks a lot. This helps, @Daniel Sahal.

KVNARK
Honored Contributor II

@Daniel Sahal - whatever links you share for any requirement are always very helpful. Kudos!

daniel_sahal
Esteemed Contributor

@KVNARK Thanks! I'm always happy to help 🙂
