In Databricks 10.1 it is possible to define, in the cluster's "Spark Config", something like:

spark.fernet {{secrets/myscope/encryption-key}}

In my case the scopes are backed by Azure Key Vault.
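For context, udfDecrypt is essentially a thin wrapper around standard Fernet decryption. A minimal sketch of the idea (my real function is a permanent SQL function in the default schema; the temporary Python registration, the function body, and the use of the cryptography package here are only illustrative):

%python
from cryptography.fernet import Fernet
from pyspark.sql.types import StringType

def fernet_decrypt(token, key):
    # Fernet expects a url-safe base64-encoded 32-byte key;
    # decrypt() takes the encrypted token and returns the plaintext bytes
    return Fernet(key.encode()).decrypt(token.encode()).decode()

# Register as a temporary SQL function (illustrative only)
spark.udf.register("udfDecrypt", fernet_decrypt, StringType())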
With that in place I can run a query like this:
%sql
SELECT default.udfDecrypt('my encrypted data', "${spark.fernet}");
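The "${spark.fernet}" reference relies on SQL variable substitution reading the value from the Spark conf. On 10.1 I can confirm the secret-backed conf resolves, e.g. (Python, illustrative):

%python
# On 10.1 this returns the resolved secret value
# (Databricks redacts secret values in notebook output)
print(spark.conf.get("spark.fernet"))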
However, starting with Databricks 10.2, the same query fails with: "Error in SQL statement: SparkSecurityException: Cannot read sensitive key 'spark.fernet' from secure provider".
I've already searched the Databricks 10.2 release notes for a change that could explain this, but couldn't find anything. Has this functionality been removed entirely, or is there a way to re-enable reading this configuration?