Suppress output in python notebooks
03-01-2023 02:27 PM
My dilemma is this: we use PySpark inside Databricks to connect to external data sources via JDBC. Every time we issue a Spark command, it prints the connection options, including the username, URL, and password, which is not advisable. Is there a way to suppress this output, or redirect it somewhere else instead of printing it on the console?
A Google search suggested that it is not possible to suppress output from Python notebooks. But I want to hear from the experts: is there a way, please?
03-01-2023 10:17 PM
Hi @Priya Venkatesh ,
From the notebook cell's dropdown menu you can hide the result.
Also, you can try encoding the sensitive values with base64, though note that base64 only obscures them; it is reversible encoding, not encryption.
https://docs.python.org/3/library/base64.html
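As a minimal sketch of that suggestion (the password here is a made-up example value, and again, base64 hides credentials from casual view but does not secure them):

```python
import base64

password = "s3cret"  # example value, not a real credential

# Encode: bytes -> base64 text, so the raw value isn't shown as-is
encoded = base64.b64encode(password.encode("utf-8")).decode("ascii")

# Decode: anyone holding the encoded string can trivially recover the original
decoded = base64.b64decode(encoded).decode("utf-8")
assert decoded == password
```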
Please let us know if this helps.
03-03-2023 11:49 AM
Databricks recommends using Databricks secrets for sensitive information like usernames and passwords. Secret values are redacted in printed output and logs, so they are not exposed to other users in the workspace even when the notebook is shared with other team members.
https://docs.databricks.com/security/secrets/index.html
Here is an example of it:
https://docs.databricks.com/security/secrets/example-secret-workflow.html
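As a minimal sketch of that approach (the scope name `jdbc` and the key names are assumptions; `dbutils.secrets` is a Databricks notebook builtin, so the environment-variable fallback below exists only to keep the snippet runnable outside a notebook):

```python
import os

def get_secret(scope: str, key: str) -> str:
    """Fetch a credential from a Databricks secret scope.

    Inside a Databricks notebook, `dbutils` is a builtin global and any
    secret value that gets printed is redacted. Outside Databricks we fall
    back to an environment variable so this sketch stays self-contained.
    """
    try:
        return dbutils.secrets.get(scope=scope, key=key)  # noqa: F821
    except NameError:
        return os.environ.get(f"{scope}_{key}".upper(), "")

# JDBC options built from secrets instead of hard-coded strings
jdbc_options = {
    "url": get_secret("jdbc", "url"),
    "user": get_secret("jdbc", "username"),
    "password": get_secret("jdbc", "password"),
}
```

In a notebook these options would then be passed to the JDBC reader (e.g. `spark.read.format("jdbc").options(**jdbc_options)`), so the credentials never appear as literals in the notebook at all.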
03-16-2023 10:05 PM
Hi @Priya Venkatesh,
Hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!
06-09-2023 04:47 AM
You can use the `%%capture` cell magic to suppress output, but I still face the problem that the Spark configuration is not correctly transferred from the setup notebook to the running notebook:

```python
%%capture
%run ../utils/setup.ipynb
```
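As an alternative sketch in plain Python (not Databricks-specific), `contextlib.redirect_stdout` can divert the output of a block of code into a buffer instead of the console; the chatty function here is a stand-in for whatever call prints the connection details:

```python
import contextlib
import io

def connect_verbosely() -> str:
    # Stand-in for a call that prints connection details as a side effect
    print("connecting with user=admin password=hunter2")
    return "connection-handle"

# Divert stdout into a buffer instead of the notebook console
buffer = io.StringIO()
with contextlib.redirect_stdout(buffer):
    handle = connect_verbosely()

# The return value is intact; the printed noise went into the buffer
assert handle == "connection-handle"
assert "password" in buffer.getvalue()
```

To discard the output entirely rather than keep it, the buffer can be replaced with an open handle to `os.devnull`.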
06-09-2023 10:46 PM
Thanks for taking the time to discuss this, I feel strongly about it and love learning more on this topic.