Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.
pdb debugger on databricks

johnp
New Contributor III

I am new to Databricks and am trying to debug my Python application with the Variable Explorer, following the instructions from: https://www.databricks.com/blog/new-debugging-features-databricks-notebooks-variable-explorer

I added "import pdb" in the first cell, then "pdb.set_trace()" inside the functions I wanted to debug. After running all the cells, Databricks seems to stop at pdb.set_trace(), but I did not get the cell with the ipdb prompt. Also, the list of variables does not update (the local variables did not show in the panel on the right side). Is there something I am missing here?

From another help page: https://docs.databricks.com/en/_extras/notebooks/source/python-debugger.html, I wonder if I need to set %debug or %pdb instead of %python to get the ipdb prompt. But that page seems quite old, so I am not sure it is still valid.
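A minimal sketch of the setup described above (the function and data are placeholders, not the actual application):

```python
# Sketch of the pattern: import pdb once, then call pdb.set_trace()
# inside the function to be debugged.
import pdb

DEBUG = False  # set to True in the notebook to actually pause


def process(data):
    total = sum(data)
    # Breakpoint: in the notebook this should open an ipdb prompt below
    # the cell and refresh the Variable Explorer panel. Guarded by a
    # flag here so the sketch also runs non-interactively.
    if DEBUG:
        pdb.set_trace()
    return total


print(process([1, 2, 3]))  # prints 6
```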

1 ACCEPTED SOLUTION


Kaniz_Fatma
Community Manager

Hi @johnp,

  • In Databricks, you don’t need to use %debug or %pdb magic commands like you would in a regular Python environment.
  • Instead, simply use pdb.set_trace() to trigger the debugger.
  • While paused at the breakpoint, you should be able to inspect variables.
  • The Variable Explorer panel on the right side of the notebook should automatically update with the state of the notebook at that breakpoint.
  • You can view local variables, DataFrame contents, and other relevant information.
  • If you’re still encountering issues, ensure that your notebook environment is set up correctly.
  • Double-check that you’re not accidentally using %python magic commands (which are not needed for debugging).
  • If you have any more questions, feel free to ask! 😊

    Learn more about the Variable Explorer in our developer documentation.
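To illustrate the points above, here is a minimal sketch (the names are made up for the example): pdb.set_trace() alone triggers the debugger, with no %debug or %pdb magic command.

```python
# Illustrative sketch: a plain pdb.set_trace() call is enough to pause
# a notebook cell; no magic commands are required.
import pdb


def summarize(values):
    stats = {"n": len(values), "total": sum(values)}
    # Uncomment the next line to pause here in a notebook. At the
    # ipdb> prompt you can type, for example:
    #   p stats       -> print the local dict
    #   pp locals()   -> pretty-print all local variables
    #   c             -> continue execution
    # The Variable Explorer should also list `values` and `stats`.
    # pdb.set_trace()
    return stats


print(summarize([2, 4, 6]))  # prints {'n': 3, 'total': 12}
```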


2 REPLIES 2


johnp
New Contributor III

I tested with some simple applications, and it works as you described. However, the application I am debugging uses PySpark Structured Streaming, which runs continuously. After inserting pdb.set_trace(), the notebook appeared to pause at the breakpoint, but the streaming query continued to process incoming data, and no ipdb prompt showed up. How do I get the ipdb> prompt in this case?
