02-21-2022 11:38 AM
In Databricks 10.1 it is possible to define, in the cluster's "Spark Config", something like:
spark.fernet {{secrets/myscope/encryption-key}}. In my case the secret scopes are backed by Azure Key Vault.
With that I can make a query as follows:
%sql
SELECT default.udfDecrypt('my encrypted data', "${spark.fernet}");
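(For anyone reproducing this: default.udfDecrypt is a user-defined function, not a built-in. A minimal stand-alone sketch of what such a helper might wrap, assuming the `cryptography` package's Fernet implementation — the function name and key handling here are illustrative only:)

```python
from cryptography.fernet import Fernet

# Hypothetical stand-in for the key the secret scope would supply;
# on Databricks this value would come from {{secrets/myscope/encryption-key}}.
key = Fernet.generate_key()
token = Fernet(key).encrypt(b"my encrypted data")

def udf_decrypt(token_str: str, key_str: str) -> str:
    # Sketch of what a default.udfDecrypt helper might do internally.
    return Fernet(key_str.encode()).decrypt(token_str.encode()).decode()

print(udf_decrypt(token.decode(), key.decode()))  # -> my encrypted data
```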
However, starting from Databricks 10.2 I get the following message: "Error in SQL statement: SparkSecurityException: Cannot read sensitive key 'spark.fernet' from secure provider".
I've already looked in the Databricks 10.2 change log for something that could have caused this, but I couldn't find it.
Has this functionality been completely removed or can I enable this configuration reading?
02-21-2022 10:23 PM
@Cassio Eskelsen Using a secret in a SELECT query with the $ syntax is now blocked for security reasons in the new DBRs (10.2+), and it will soon be blocked in all supported DBRs in future releases. This is why you get the error on DBR 10.2 but not on earlier DBR versions.
You can add the following Spark configuration to your cluster settings to disable the validation that was added in the new DBRs. It has to be added at the cluster level.
02-22-2022 03:32 AM
Thanks, Ravi! That solved my problem!
02-22-2022 01:52 PM
Thanks for selecting the best answer, @Cassio Eskelsen! It's super helpful and lets us know you got what you needed!
07-15-2022 01:30 AM
This solution exposes the entire secret if I run commands like the one below:
sql("""explain select upper("${spark.fernet.email}") as data """).display()
Please don't use this.
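The leak happens because ${...} references are expanded as plain text into the statement before it runs, so the secret ends up in the query string itself, and therefore in EXPLAIN output and logs. A rough stdlib illustration of that substitution — this is not Databricks' actual implementation, and the conf values are made up:

```python
import re

def expand_vars(sql_text: str, conf: dict) -> str:
    # Textually inline ${key} references from a config mapping,
    # leaving unknown keys untouched.
    return re.sub(r"\$\{([^}]+)\}",
                  lambda m: conf.get(m.group(1), m.group(0)),
                  sql_text)

conf = {"spark.fernet.email": "s3cret-key-value"}  # hypothetical secret
expanded = expand_vars('explain select upper("${spark.fernet.email}") as data', conf)
print(expanded)  # the raw secret now sits verbatim in the query text
```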