by PriyaV • New Contributor II
- 15607 Views
- 5 replies
- 10 kudos
My dilemma is this - We use PySpark to connect to external data sources via JDBC from within Databricks. Every time we issue a Spark command, it spits out the connection options, including the username, URL, and password, which is not advisable. So, is ...
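One way to avoid echoing credentials is to keep them out of plain text (for example in environment variables or a Databricks secret scope via `dbutils.secrets.get`, whose values Databricks redacts in notebook output) and to redact sensitive keys before anything is printed or logged. The sketch below is a minimal illustration of the redaction idea; the `redacted` helper, the `SENSITIVE_KEYS` set, and the option values are all hypothetical, not part of any Spark or Databricks API.

```python
import os

# Hypothetical set of option keys we never want echoed in logs or output.
SENSITIVE_KEYS = {"url", "user", "password"}

def redacted(options):
    """Return a copy of JDBC options that is safe to print or log."""
    return {k: ("***" if k in SENSITIVE_KEYS else v) for k, v in options.items()}

# Hypothetical JDBC options; in Databricks the password would typically come
# from a secret scope (dbutils.secrets.get) rather than a literal string.
jdbc_options = {
    "url": "jdbc:postgresql://host:5432/db",
    "user": os.environ.get("DB_USER", "svc_account"),
    "password": os.environ.get("DB_PASSWORD", "placeholder"),
    "dbtable": "public.orders",
}

# Only the redacted copy is ever printed; the real dict is passed to Spark.
print(redacted(jdbc_options))
```

The real `jdbc_options` dict would then be handed to `spark.read.format("jdbc").options(**jdbc_options)` as usual; only the redacted copy appears in any output.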
- 1137 Views
- 1 replies
- 0 kudos
Is there an equivalent of `%debug` from Jupyter notebooks in Databricks notebooks for debugging Python notebooks?
Latest Reply
Hi @Nathan Tong, you can go through the two articles below that I found online for debugging in Databricks: 1. 7 Tips to Debug Apache Spark Code Faster with Databricks 2. Easier Spark Code Debugging
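Beyond those articles, one approach worth knowing is Python's built-in `pdb` module: `pdb.post_mortem()` drops you into the debugger at the point of a failure, which is the closest standard-library analogue to Jupyter's `%debug`. The sketch below is a minimal illustration; the `divide` function is a made-up example, and the `post_mortem` call is left commented out because it only makes sense in an interactive cell.

```python
import pdb
import sys

def divide(a, b):
    """Toy function that fails on b == 0, standing in for real notebook code."""
    return a / b

try:
    divide(1, 0)
except ZeroDivisionError:
    # Capture the traceback of the failed call.
    exc_type, exc_value, tb = sys.exc_info()
    # In an interactive Databricks cell, uncommenting the next line opens the
    # pdb prompt at the point of failure, much like %debug in Jupyter:
    # pdb.post_mortem(tb)
    print(f"caught {exc_type.__name__}: {exc_value}")
```

Whether the interactive `pdb` prompt is usable depends on the Databricks Runtime version, so treat this as a sketch rather than a guaranteed workflow.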
- 1898 Views
- 1 replies
- 0 kudos
Do we have an analogous concept to package cells for Python notebooks?
Latest Reply
You can just declare your classes in one cell and use them in the others. It is recommended to keep all your classes in one notebook and use %run in the others to "import" those classes. The one thing you cannot do is literally import a folder/...
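The pattern above can be sketched as follows. The notebook path `/Shared/helpers` and the `Greeter` class are hypothetical examples, not anything Databricks provides; the point is simply that classes defined in one notebook become available after `%run`.

```python
# Contents of a shared notebook, e.g. /Shared/helpers (hypothetical path).
# Classes defined here become available in any notebook that %run's this one.

class Greeter:
    """Example class kept in a shared 'classes' notebook."""

    def __init__(self, name):
        self.name = name

    def greet(self):
        return f"Hello, {self.name}!"
```

In a consuming notebook you would then have a cell containing only `%run /Shared/helpers`, after which `Greeter("Spark").greet()` works as if the class had been imported.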