01-16-2023 10:25 PM
How can we parameterize the key of the spark-config in the job cluster linked service from Azure Data Factory? We can parameterize the values, but is there any way to parameterize the key as well, so that when deploying to further environments it picks up the PROD/QA values? Thanks much in advance!
01-17-2023 12:07 AM
@KVNARK .
You can use Databricks Secrets (create a secret scope backed by Azure Key Vault: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes) and then reference a secret in the Spark configuration (https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secrets).
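To illustrate, here is a minimal sketch of the approach, assuming a Key Vault-backed scope named "akv-scope" and a secret named "storage-account-key" (both placeholder names), running inside a Databricks notebook or job where `spark` and `dbutils` are predefined. The same secret can also be referenced declaratively in the cluster's Spark config (including the config set from the ADF linked service) using the documented {{secrets/<scope>/<key>}} syntax.

```python
# Sketch with assumed names: "akv-scope" is an Azure Key Vault-backed secret scope,
# "storage-account-key" is a secret stored in the linked Key Vault.
#
# In the cluster's Spark config (e.g. the config supplied via the ADF linked service),
# a secret can be referenced declaratively, one "key value" pair per line:
#   spark.hadoop.fs.azure.account.key.mystorageaccount.dfs.core.windows.net {{secrets/akv-scope/storage-account-key}}

# At notebook/job runtime, the same secret can be read with dbutils
# and applied to the Spark session configuration:
storage_account = "mystorageaccount"  # hypothetical storage account name

account_key = dbutils.secrets.get(scope="akv-scope", key="storage-account-key")

spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    account_key,
)
```

Since the environment-specific value then lives in the Key Vault behind the scope, the Spark config key itself can stay fixed across environments, and PROD/QA differences are resolved by whichever Key Vault the scope points to.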
01-17-2023 01:57 AM
Thanks a lot. This helps @Daniel Sahal
01-17-2023 02:10 AM
@Daniel Sahal - whatever links you share for any requirement are really very helpful. Kudos!
01-17-2023 02:11 AM
@KVNARK . Thanks! I'm always happy to help.