"SparkSecurityException: Cannot read sensitive key" error when reading key from Spark config

Cassio
New Contributor II

In Databricks Runtime 10.1 it is possible to define something like this in the cluster's "Spark Config":

spark.fernet {{secrets/myscope/encryption-key}}

In my case the secret scopes are backed by Azure Key Vault.

With that in place I can run a query as follows:

%sql

SELECT default.udfDecrypt('my encrypted data', "${spark.fernet}");
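
(default.udfDecrypt is not defined anywhere in this thread; for reference, a minimal Fernet-based sketch of such a UDF, assuming the cryptography library is installed on the cluster, could look like this:)

%python

from cryptography.fernet import Fernet
from pyspark.sql.types import StringType

# Decrypt a Fernet token; both the token and the key arrive as strings from SQL.
def decrypt(token, key):
    if token is None:
        return None
    return Fernet(key.encode()).decrypt(token.encode()).decode()

# Session-scoped registration, callable from SQL as udfDecrypt(...);
# the thread's default.udfDecrypt is presumably a permanently registered equivalent.
spark.udf.register("udfDecrypt", decrypt, StringType())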

However, starting from Databricks 10.2, I get the following message: "Error in SQL statement: SparkSecurityException: Cannot read sensitive key 'spark.fernet' from secure provider".

I've already looked through the Databricks 10.2 change log for something that could have caused this, but I couldn't find anything.

Has this functionality been removed completely, or can I re-enable this kind of configuration reading?


4 REPLIES

Ravi
Valued Contributor

@Cassio Eskelsen Using a secret in a SELECT query with the $ syntax is now blocked for security reasons in the newer DBRs (10.2+), and it will soon be blocked in all supported DBRs with future releases. That is why you get the error on DBR 10.2 but not on earlier DBR versions.

You can add the following Spark configuration to your cluster settings to disable the validation that was added in the newer DBRs; it has to be set at the cluster level (see the example after the bullet).

  • spark.databricks.secureVariableSubstitute.enabled false
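
With that flag set, the cluster's Spark config would contain both the secret reference from the question and the override, for example:

spark.fernet {{secrets/myscope/encryption-key}}
spark.databricks.secureVariableSubstitute.enabled false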

Cassio
New Contributor II

Thanks, Ravi! That's solved my problem! 😍

Anonymous
Not applicable

Thanks for selecting the best answer @Cassio Eskelsen​ ! It's super helpful and lets us know you got what you needed! 🙌🏻

Soma
Valued Contributor

This solution exposes the entire secret if you run commands like the one below:

sql("""explain select upper("${spark.fernet.email}") as data """).display()

The EXPLAIN output shows the plan with the secret already substituted in plain text, so please don't use this.
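
A safer pattern (a minimal sketch, assuming the cryptography library and a hypothetical table my_table with an encrypted_col column) is to fetch the key with dbutils.secrets.get in Python and keep it inside the UDF closure, so it never goes through $-variable substitution and cannot show up in a query plan:

%python

from cryptography.fernet import Fernet
from pyspark.sql.functions import udf, col
from pyspark.sql.types import StringType

# Secret values read via dbutils are redacted in notebook output.
key = dbutils.secrets.get(scope="myscope", key="encryption-key")

# The key lives in the Python closure, not in the SQL text, so EXPLAIN cannot leak it.
decrypt_udf = udf(
    lambda t: Fernet(key.encode()).decrypt(t.encode()).decode() if t is not None else None,
    StringType(),
)

df = spark.table("my_table")  # hypothetical table holding Fernet-encrypted strings
df.select(decrypt_udf(col("encrypted_col")).alias("decrypted")).display()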
