Data Engineering

How to parameterize the key of a Spark config in the job cluster linked service from ADF

KVNARK
Honored Contributor II

How can we parameterize the key of the Spark config in the job cluster linked service from Azure Data Factory? We can parameterize the values, but is there any way to parameterize the key as well, so that each deployment picks up the right PROD/QA values? Thanks much in advance!

1 ACCEPTED SOLUTION

Accepted Solutions

daniel_sahal
Esteemed Contributor

@KVNARK

You can use Databricks Secrets: create a secret scope backed by Azure Key Vault (https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes) and then reference the secret in the Spark configuration (https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secrets).
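
As an illustration (not from the thread), a minimal sketch of a Spark config entry on the job cluster, assuming an AKV-backed secret scope named akv-scope and a secret named adls-account-key already exist in each environment:

    spark.hadoop.fs.azure.account.key.<storage-account>.dfs.core.windows.net {{secrets/akv-scope/adls-account-key}}

The config key stays the same across environments; the {{secrets/...}} reference is resolved at cluster start from that environment's Key Vault, so the QA and PROD values are picked up without changing anything in the ADF linked service.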


4 REPLIES


KVNARK
Honored Contributor II

Thanks a lot. This helps, @Daniel Sahal.

KVNARK
Honored Contributor II

@Daniel Sahal - whatever links you share for any requirement are always very helpful. Kudos!

daniel_sahal
Esteemed Contributor

@KVNARK Thanks! I'm always happy to help 🙂
