01-16-2023 10:25 PM
How can we parameterize the key of the spark-config in the job cluster linked service from Azure Data Factory? We can parameterize the values, but is there any way to parameterize the key as well, so that when deploying to further environments it picks up the PROD/QA values? Thanks much in advance!
01-17-2023 12:07 AM
@KVNARK .
You can use Databricks Secrets (create a secret scope from AKV: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes) and then reference the secret in the Spark configuration (https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secrets).
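As a minimal sketch (the scope name my-akv-scope, the secret name adls-account-key, and the storage account/Spark property used here are placeholders, not values from this thread), the job cluster's Spark config can reference the secret instead of a literal value, so QA and PROD only differ in what is stored in the Key Vault:

    spark.hadoop.fs.azure.account.key.mystorageaccount.dfs.core.windows.net {{secrets/my-akv-scope/adls-account-key}}

The same secret can also be read at runtime in a notebook or job with dbutils.secrets.get(scope="my-akv-scope", key="adls-account-key"), so no environment-specific value needs to be hard-coded in the ADF linked service.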
01-17-2023 01:57 AM
Thanks a lot. This helps, @Daniel Sahal
01-17-2023 02:10 AM
@Daniel Sahal - whatever links you share, for any requirement, are really very helpful. Kudos...
01-17-2023 02:11 AM
@KVNARK . Thanks! I'm always happy to help!