Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

"SparkSecurityException: Cannot read sensitive key" error when reading key from Spark config

Cassio
New Contributor II

In Databricks Runtime 10.1 it is possible to define an entry like the following in the cluster's "Spark Config" (in my case, the secret scopes are backed by Azure Key Vault):

spark.fernet {{secrets/myscope/encryption-key}}

With that I can make a query as follows:

%sql
 
SELECT default.udfDecrypt('my encrypted data', "${spark.fernet}");

However, starting with Databricks Runtime 10.2, I get the following message: "Error in SQL statement: SparkSecurityException: Cannot read sensitive key 'spark.fernet' from secure provider".

I've already looked through the Databricks 10.2 changelog for something that could have caused this, but I couldn't find anything.

Has this functionality been removed entirely, or can I re-enable this kind of configuration read?

1 ACCEPTED SOLUTION


Ravi
Databricks Employee

@Cassio Eskelsen​  Using secrets in SELECT queries via the $ syntax is now blocked for security reasons in the newer DBRs (10.2+), and it will soon be blocked in all supported DBRs in future releases. That is why you see the error on DBR 10.2 but not on earlier DBR versions.

You can add the following Spark configuration to your cluster settings to disable the validation that was added in the newer DBRs. It has to be set at the cluster level.

  • spark.databricks.secureVariableSubstitute.enabled false
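Taken together with the secret reference from the question, the cluster's Spark config would then contain both lines (the scope and key names here are the ones from the original post; adjust them to your own setup):

```
spark.fernet {{secrets/myscope/encryption-key}}
spark.databricks.secureVariableSubstitute.enabled false
```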


4 REPLIES


Cassio
New Contributor II

Thanks, Ravi! That solved my problem! 😍

Anonymous
Not applicable

Thanks for selecting the best answer @Cassio Eskelsen​ ! It's super helpful and lets us know you got what you needed! 🙌🏻

Soma
Valued Contributor

This solution exposes the entire secret if you run a command like the one below, because the substituted value appears in plain text in the EXPLAIN output:

sql("""explain select upper("${spark.fernet.email}") as data """).display()

Please don't use this.
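Soma's point is easy to reproduce without a cluster: $-substitution splices the secret into the SQL text as a plain literal before the statement is parsed, so anything that echoes the statement (EXPLAIN output, query logs, notebook history) reveals it. A minimal stdlib sketch of that substitution step (the `substitute_vars` helper and config values are illustrative, not Spark's actual implementation):

```python
import re

def substitute_vars(sql: str, conf: dict) -> str:
    """Mimic ${var} substitution: each ${name} in the SQL text is
    replaced by the literal value from the session configuration."""
    return re.sub(r"\$\{([^}]+)\}", lambda m: conf[m.group(1)], sql)

# Hypothetical session config holding a secret key (illustration only).
conf = {"spark.fernet": "MY-SECRET-KEY"}

stmt = substitute_vars('SELECT default.udfDecrypt(col, "${spark.fernet}")', conf)

# The secret is now embedded as a plain literal in the statement text,
# so anything that prints the statement exposes it.
print(stmt)  # SELECT default.udfDecrypt(col, "MY-SECRET-KEY")
```

This is why disabling `spark.databricks.secureVariableSubstitute.enabled` trades convenience for a real leak risk: the substituted statement, secret included, is visible to anyone who can view the query text or its plan.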
