Why do Python logs show the [REDACTED] literal in place of spaces when I use dbutils.secrets.get in my code?

Abel_Martinez
Contributor

When I use dbutils.secrets.get in my code, spaces in the log are replaced by the "[REDACTED]" literal. This is very annoying and makes the log difficult to read. Any idea how to avoid this?

See my screenshot...

11 REPLIES

Sivaprasad1
Valued Contributor II

Hi @Abel Martinez: that is by design. Secret values are hidden and won't be visible in notebook output.

https://docs.databricks.com/security/secrets/redaction.html

UmaMahesh1
Honored Contributor III

The reason we store credentials as Databricks secrets is to protect them when you run your jobs and notebooks. To avoid accidental display of these secrets, Databricks redacts them.

The same can be found in the documentation below.

https://docs.databricks.com/security/secrets/redaction.html
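
For example, something like this (the scope and key names are just placeholders) shows the redaction:

secret = dbutils.secrets.get(scope="myScope", key="myKey")

# Any direct display of the value is masked in the cell output
print(secret)   # prints: [REDACTED]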

UmaMahesh1
Honored Contributor III

As for the second part, if you want to get those secrets, there is a workaround.

secret = dbutils.secrets.get(scope="scopeName", key="KeyName")

# Printing one character at a time bypasses the whole-string redaction
for char in secret:
    print(char, end=" ")

Hope this helps..

Abel_Martinez
Contributor

Hi all, thanks for your feedback. I don't want to show the secrets; what I want is for the Python output not to show spaces as [REDACTED]. I'm not printing the secrets; the text with [REDACTED] instead of spaces is the normal output of the Python notebook and its exceptions.

UmaMahesh1
Honored Contributor III

[REDACTED] is a defined placeholder.

If you want to replace it in the output for visual purposes, you can play around with the code a bit to get there.

In the notebook cell where you are getting that output/error, use the below, assigning the output or error to a variable.

from IPython.utils.capture import capture_output

# Run your code inside the capture context so its stdout/stderr are collected
with capture_output() as capture:
    ...  # <your code>

var1 = capture.stdout  # use stdout or stderr as fits

When you print this var1, ideally the placeholder should display as blank, but if it still shows up, you can always use Python's .replace function for that.
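
For example, something like this (assuming the placeholder really is the literal string "[REDACTED]"):

cleaned = var1.replace("[REDACTED]", " ")   # put a space back in place of the placeholder
print(cleaned)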

Please do let me know if this works...Cheers.

Not the solution I was looking for, but it works as a workaround.

Thanks!

Hi @Abel Martinez, it would mean a lot if you could select the "Best Answer" to help others find the correct answer faster.

This makes that answer appear right after the question, so it's easier to find within a thread.

It also helps us mark the question as answered so we can have more eyes helping others with unanswered questions.

Can I count on you?

SS2
Valued Contributor

You cannot see the secret in a Databricks notebook; it will always show [REDACTED]. Even if your data contains a value that happens to match a secret, that data will also be shown as [REDACTED].
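
For example (the scope, key, and the value "hunter2" below are made up), once a secret has been read in the notebook, any matching text in the output is redacted too:

dbutils.secrets.get(scope="myScope", key="myKey")   # suppose the secret's value is "hunter2"

# Unrelated output that happens to contain the same string is also masked
print("connecting with password hunter2")   # shows: connecting with password [REDACTED]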

Ajay-Pandey
Esteemed Contributor II

Databricks redacts secret values that are read using dbutils.secrets.get(). When displayed in notebook cell output, the secret values are replaced with [REDACTED].

Although it is not recommended, there is a workaround to see the actual value with a simple for-loop trick. You will get the value separated by spaces.

value = dbutils.secrets.get(scope="myScope", key="myKey")
 
for char in value:
    print(char, end=" ")
 
Out:
y o u r _ v a l u e

labtech
Valued Contributor II

Yes, obviously you cannot see the value of a secret from a secret scope. That is important behavior of secret scopes.

jlb0001
New Contributor III

I ran into the same issue and found that the reason was that the notebook included some test keys with values of "A" and "B" for simple testing. I noticed that any string containing the substring "A" or "B" was shown as "[REDACTED]".

So, in my case, it was an easy fix: just use less-common values for my test secrets, which made the silly and annoying redaction go away. Thanks to this thread for helping me realize this was the issue!
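
For example, roughly what I was hitting (the scope, key, and values are made up, and the exact redacted form may vary):

# A test secret created with the single-character value "A"
dbutils.secrets.get(scope="testScope", key="testKey")   # value: "A"

# Now any capital "A" in later cell output gets masked
print("Loading table ACCOUNTS")   # shows: Loading table [REDACTED]CCOUNTS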
