Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Cannot use Databricks REST API for secrets "get" inside bash script (byte format only)

m997al
Contributor II

Hi,

I am trying to use Databricks-backed secret scopes inside Azure DevOps pipelines.  I am almost successful.

I can use the REST API to "get" a secret value back inside my bash script, but the value is in byte format, so it is unusable as a local variable in my script (for other logins, etc.).
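For anyone hitting the same "byte format" wall: the secrets `get` endpoint returns the value base64-encoded in a JSON `value` field, so it can be decoded in the shell. A minimal sketch, assuming `DATABRICKS_HOST` and `DATABRICKS_TOKEN` are set in the pipeline environment; the scope and key names are hypothetical placeholders:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical helper: the secrets "get" endpoint returns JSON like
# {"value": "<base64>"}; scope/key names here are placeholders.
get_secret() {
  curl -s -H "Authorization: Bearer $DATABRICKS_TOKEN" \
    "$DATABRICKS_HOST/api/2.0/secrets/get?scope=my-scope&key=my-key" \
    | python3 -c 'import sys, json; print(json.load(sys.stdin)["value"])' \
    | base64 --decode
}

# The decode step on its own, with a sample payload:
sample='{"value":"bXktc2VjcmV0"}'
printf '%s' "$sample" \
  | python3 -c 'import sys, json; print(json.load(sys.stdin)["value"])' \
  | base64 --decode    # prints: my-secret
```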

I know that dbutils can be used inside Databricks notebooks, but that is not my use case here.  I want to use the secrets from the Databricks-backed secret scopes as key vaults that I can access from inside my ADO pipeline yaml files.

Does anyone know what the solution might be?

3 REPLIES

Kaniz_Fatma
Community Manager

Hi @m997al, First, verify that permissions are correctly set for the schema and tables. If that's not the issue, try refreshing the metadata using the command `REFRESH TABLE <your_table_name>;`. Also, check if any tables are hidden or if filters might be affecting their visibility. If the problem persists, consider recreating the schema and tables.

How's your migration to Unity Catalog going? Any specific challenges you're facing there?

m997al
Contributor II

Hi - I think I have discovered the issue.  In my bash script in the ADO pipeline, I tried to assign the value from the "get" to a local variable in the yaml file, for example:

secret_val=$(<get secret from Databricks scope via REST API>)
<another external REST API call to a non-Databricks service, using bearer: $secret_val>

But this does work (I have seen it work elsewhere):

<another external REST API call to a non-Databricks service, using bearer: $(get secret from Databricks scope via REST API)>

...i.e., you have to use the secret on-the-fly, rather than assigning it to a variable and using that variable later.
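The contrast can be sketched in bash as follows. The `fetch_databricks_secret` helper is a hypothetical stub standing in for the real REST call; in plain bash both forms are equivalent, and the difference only appears when Azure DevOps masks the value:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical stub standing in for the real curl + base64 decode
# against /api/2.0/secrets/get; the token value is made up.
fetch_databricks_secret() { echo "example-token"; }

# Pattern that did NOT work inside the ADO yaml step:
# assign the secret to a variable, then use that variable later.
secret_val=$(fetch_databricks_secret)
header_a="Authorization: Bearer $secret_val"

# Pattern that worked: expand the secret on-the-fly
# in the command that needs it.
header_b="Authorization: Bearer $(fetch_databricks_secret)"

# In plain bash the two headers are identical; under Azure DevOps
# masking, only the on-the-fly form stays usable.
echo "$header_a"
echo "$header_b"
```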

Thanks!

 

m997al
Contributor II

I wanted to add an addendum to this.  So in Azure DevOps, when working with yaml files, you can use the Azure DevOps pipelines "Library" to load environment variables.  In the pipeline library, there is a way to "lock" a variable so that it is not plain text (it is masked by asterisks, so no one else with access to the library can see it).

Where is this going?  It turns out (and this is all about Azure DevOps pipelines so far) that when a pipeline library variable is masked, it can only be used in the yaml file "on-the-fly"...meaning you cannot assign the environment variable to another local yaml file variable, such as "mytempvar=$(library_masked_variable)"...

That will not work: "$mytempvar" will just be encrypted and unusable.

I think the same thing is going on with secrets that are pulled from a Databricks secret scope and used inside Azure DevOps pipeline yaml files.  This might have more to do with Azure DevOps and the way it processes bash scripting in yaml files than it does with anything about Azure Databricks secret scopes and the REST API.
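For illustration, the addendum's point as an ADO yaml fragment. The `library_masked_variable` name is from the post above; the target URL is a placeholder, and this sketch assumes the variable is linked in from a pipeline library:

```yaml
steps:
  - bash: |
      # Will not work: the masked value assigned here comes
      # through encrypted/unusable
      mytempvar=$(library_masked_variable)
      curl -H "Authorization: Bearer $mytempvar" https://example.com/api

      # Works: reference the masked library variable on-the-fly
      curl -H "Authorization: Bearer $(library_masked_variable)" https://example.com/api
    displayName: Use masked variable inline
```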

Hope that helps.
