Data Engineering

Incorrect secret value when loaded as environment variable

ksilva
New Contributor

I recently faced an issue that took a good few hours to identify.

I'm loading an environment variable with a secret:

ENVVAR: {{secrets/scope/key}}

The secret is loaded in my application, and I could verify it's there, but its value is not correct. I realised this after comparing the environment variable's value with the value returned when loading the secret with dbutils:

envvar = dbutils.secrets.get(scope="scope", key="key")

After a lot of trial and error, I realised that the problem is that my secret value contains a dollar sign: when the secret is loaded through an environment variable, the dollar sign and everything after it disappear. I suspect the value is somehow evaluated for variable substitution, so the text after the $ is treated as a variable name and replaced.

Example:

import os

print(dbutils.secrets.get(scope="scope", key="key"))
# this prints "mysecret$here"
print(os.environ["ENVVAR"])
# this prints "mysecret"
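The missing suffix is consistent with shell-style parameter expansion: in a POSIX shell, $here inside double quotes refers to a variable named here, and an undefined variable expands to the empty string. A minimal sketch reproducing the symptom outside Databricks (the shell layer is an assumption about where the substitution happens, and the secret value is illustrative):

```python
import subprocess

# A value containing "$here", like the secret above (illustrative value).
# When a POSIX shell sees it inside double quotes, "$here" is expanded as
# a parameter; since no variable named "here" is defined, it expands to
# the empty string and the suffix is lost.
out = subprocess.run(
    ["sh", "-c", 'ENVVAR="mysecret$here"; printf %s "$ENVVAR"'],
    capture_output=True,
    text=True,
).stdout

print(out)  # prints: mysecret
```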

Is this a bug in Databricks, or is it something else?

2 REPLIES

User16752242622
Valued Contributor

Hi @kleber silva,

There was a known issue, which has now been resolved, where a $ character included in a secret value caused the $ and all subsequent text to be truncated.

However, your case is related to how Spark parses the value when it is injected as an environment variable, and the fix for that is still in progress. We don't have an ETA for it yet.
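For what it's worth, environment variables themselves can carry a dollar sign without loss; the truncation happens in the layer that materialises the {{secrets/scope/key}} reference, not in the environment itself. A quick check (a sketch; the variable name and value are illustrative):

```python
import os
import subprocess
import sys

# Set the variable directly from Python: os.environ performs no expansion,
# so the dollar sign survives into the child process unchanged.
env = dict(os.environ, ENVVAR="mysecret$here")
out = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['ENVVAR'], end='')"],
    env=env,
    capture_output=True,
    text=True,
).stdout

print(out)  # prints: mysecret$here
```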

Thank you for reporting this to us.

Hi @kleber silva,

Just a friendly follow-up. Did @Akash Bhat's response help you resolve your question? If it did, please mark it as the best answer. Otherwise, please let us know if you still need help.