Pass secret in Spark config when value is in the form a.b.c={{secrets/scope/secret}}
07-09-2024 04:03 AM
I am configuring the cluster for a spark-submit task and trying to specify `spark.executor.extraJavaOptions a.b.c={{secrets/scope/secret}}`, but the literal string {{secrets/scope/secret}} is being passed through rather than the secret value itself.
I know the convention for secrets is `spark.conf.something {{secrets/scope/secret}}` when the value is the secret reference on its own, and that substitution works fine for me in similar cases. However, since my value here is a.b.c={{secrets/scope/secret}}, the secret substitution is not happening. I have also tried an environment variable, i.e. `spark.executor.extraJavaOptions a.b.c=${MY_ENV_VAR}`, but that is not working either.
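
For reference, the variants I've tried in the cluster Spark config look roughly like this (scope, key, and property names are placeholders; the comments are just my annotations):

```
# Works: the entire value is the secret reference, so it gets substituted
spark.conf.something {{secrets/scope/secret}}

# Does not work: the literal {{secrets/scope/secret}} ends up in the Java options
spark.executor.extraJavaOptions a.b.c={{secrets/scope/secret}}

# Does not work either: ${MY_ENV_VAR} is not expanded
spark.executor.extraJavaOptions a.b.c=${MY_ENV_VAR}
```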
I suspect the substitution isn't being applied because {{secrets/scope/secret}} is preceded by `=` rather than a space character.
Any suggestions on how to achieve this would be greatly appreciated!

