Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Can you use the Secrets API 2.0 in a Delta Live Tables configuration?

MetaRossiVinli
Contributor

Is the Secrets API 2.0 not applied to Delta Live Tables configurations? I understand that the Secrets API 2.0 is in public preview and this use case may not be supported yet. I tried the following two approaches, and neither works, for the reasons stated below.

In a DLT configuration JSON, under the "configuration" section, this is rejected because it is not valid JSON:

"credentials": {{secrets/my-creds/service-credentials}},

And this is valid JSON, but the reference is not replaced by the Secrets API; the curly braces and the full string end up in my Spark environment as-is:

"credentials": "{{secrets/my-creds/service-credentials}}",

I am able to load the secret with the following code in a cell in a DLT notebook, so this is a workable workaround, but I would prefer to have it in the DLT config.

a = dbutils.secrets.get(scope="my-creds", key="service-credentials")
spark.conf.set("credentials", a)

1 ACCEPTED SOLUTION


Anonymous
Not applicable

@Kevin Rossi:

As a workaround, you can use the code you provided to load the secret in a cell in a DLT notebook and set it in the Spark configuration. This will allow you to use the secret in your DLT code.
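
A minimal sketch of what that can look like end to end, assuming an illustrative JDBC source; the scope, key, table, and connection details below are placeholders, not values from this thread:

import dlt

# dbutils and spark are provided by the Databricks runtime in a DLT notebook.
# Fetch the secret once and expose it through the Spark conf.
credentials = dbutils.secrets.get(scope="my-creds", key="service-credentials")
spark.conf.set("credentials", credentials)

@dlt.table(name="example_table")  # placeholder table name
def example_table():
    # Read the secret back from the Spark conf inside the table definition.
    password = spark.conf.get("credentials")
    return (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://example-host:5432/exampledb")  # placeholder
        .option("dbtable", "public.source_table")  # placeholder
        .option("user", "svc_user")  # placeholder
        .option("password", password)
        .load()
    )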

Another workaround could be to store the credentials in a configuration file or an environment variable and read them in your DLT code using the appropriate method. For example, you could store the credentials in a YAML file and read them with the PyYAML library in Python or the yaml package in R. This may be more flexible and easier to manage than storing the credentials in the DLT configuration.
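
A rough sketch of the YAML variant in Python, assuming a hypothetical credentials file on DBFS; the path and key name are illustrative, and you would still need to keep that file somewhere appropriately restricted:

import yaml  # PyYAML

# spark is provided by the Databricks runtime; the path and key name below
# are placeholders for illustration only.
with open("/dbfs/FileStore/config/credentials.yaml") as f:
    config = yaml.safe_load(f)

spark.conf.set("credentials", config["service_credentials"])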


2 REPLIES


MetaRossiVinli
Contributor

I will continue using the dbutils.secrets workaround, probably wrapped in a tiny function so that I can import it into multiple jobs. Solved for now.
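
Roughly what I have in mind for that helper (the module, function, and default names are placeholders):

# dlt_secrets.py (hypothetical shared module)

def set_credentials_conf(spark, dbutils, scope="my-creds",
                         key="service-credentials", conf_key="credentials"):
    """Load a secret and expose it via the Spark conf for downstream DLT code."""
    value = dbutils.secrets.get(scope=scope, key=key)
    spark.conf.set(conf_key, value)
    return value

Then each job's notebook only needs:

from dlt_secrets import set_credentials_conf
set_credentials_conf(spark, dbutils)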

If it is not already on the roadmap, I suggest adding the ability to read from the Secrets API in the DLT configuration. That would be consistent with how secrets can be provided to other clusters, without needing to handle DLT environment concerns inside the notebook. I am guessing this is already planned.
