Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Suppress output in python notebooks

PriyaV
New Contributor II

My dilemma is this: we use PySpark to connect to external data sources via JDBC from within Databricks. Every time we issue a Spark command, it prints the connection options, including the username, URL, and password, which is not advisable. So, is there a way we can suppress the outputs, or maybe redirect them somewhere without printing them on the console?

A Google search suggested that it is not possible to suppress the output from Python notebooks. But I want to hear from the experts: is there a way, please?
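(Editor's note: not a complete answer to the JDBC case, but in plain Python you can silence anything written to `stdout` with `contextlib.redirect_stdout`. A minimal sketch, assuming the verbose connection messages are printed to `stdout` — the `print` line below is a hypothetical stand-in for the actual `spark.read.jdbc(...)` call:)

```python
import contextlib
import io

# Capture anything printed to stdout while the block runs, instead of
# letting it reach the notebook's output cell.
buffer = io.StringIO()
with contextlib.redirect_stdout(buffer):
    # Hypothetical stand-in for the verbose JDBC call; in a real
    # notebook this would be the spark.read.jdbc(...) invocation.
    print("url=jdbc:postgresql://host/db, user=admin, password=hunter2")

# The sensitive line now lives in `captured`, not on the console.
captured = buffer.getvalue()
```

Note this only covers output that goes through Python's `stdout`; log lines emitted by the JVM side of Spark are not captured this way.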

5 REPLIES

Debayan
Databricks Employee

Hi @Priya Venkatesh,

From the cell's dropdown menu, you can hide the result:

Also, you can try encoding it through base64:

https://docs.python.org/3/library/base64.html
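(Editor's note: a minimal sketch of the base64 suggestion. Keep in mind that base64 is reversible encoding, not encryption — it only keeps a value from being readable at a glance, and anyone can decode it:)

```python
import base64

# Encode a credential so it is not human-readable in output, then
# decode it back. This is obfuscation, not a security measure.
secret = "s3cret-password"
encoded = base64.b64encode(secret.encode("ascii")).decode("ascii")
decoded = base64.b64decode(encoded).decode("ascii")
```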

Please let us know if this helps.

arun_pamulapati
Databricks Employee

Databricks recommends using Databricks secrets for sensitive information such as usernames and passwords. Secret values are redacted in printed output and logs, and are not visible to other users who can access the notebook.

https://docs.databricks.com/security/secrets/index.html

Here is an example of it:

https://docs.databricks.com/security/secrets/example-secret-workflow.html
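(Editor's note: the linked pages cover the Databricks-secrets side. Outside a workspace, one complementary pattern is to keep JDBC options in a wrapper whose `repr` redacts sensitive keys, so an accidental `print` never leaks them. `JdbcOptions` and `SENSITIVE_KEYS` below are illustrative names, not a Databricks or Spark API:)

```python
# Hypothetical sketch: a dict subclass that redacts sensitive keys
# whenever it is printed or displayed.
SENSITIVE_KEYS = {"user", "password"}

class JdbcOptions(dict):
    def __repr__(self):
        redacted = {k: ("[REDACTED]" if k in SENSITIVE_KEYS else v)
                    for k, v in self.items()}
        return repr(redacted)

opts = JdbcOptions(url="jdbc:postgresql://host/db",
                   user="svc_user",
                   password="s3cret")
print(opts)  # user and password appear as [REDACTED]
```

The raw values are still accessible via normal key lookup (`opts["password"]`), so the options can be passed to the connector unchanged; only the display form is redacted.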

Anonymous
Not applicable

Hi @Priya Venkatesh,

Hope all is well! Just wanted to check in: were you able to resolve your issue? If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

fried
New Contributor II

You can use `%%capture` to suppress output, but I still face the same problem: the Spark configuration is not correctly transferred from the setup notebook to the running notebook.

%%capture
%run ../utils/setup.ipynb

Pabeggetur
New Contributor II

Thanks for taking the time to discuss this, I feel strongly about it and love learning more on this topic.

