Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Soumitra Dutta: How do I debug exceptions in Databricks notebooks?

soumitradutta
New Contributor

Hello,

My name is Soumitra Dutta. I’m currently in the USA and running into some exceptions in my Databricks notebook. For anyone who has dealt with similar issues, what steps do you usually take to debug them? I’d really appreciate your suggestions.

 

Soumitra Dutta
1 ACCEPTED SOLUTION

Accepted Solutions

CerberusByte
Databricks Employee

Hi Soumitra, this is quite a broad question, as it depends on what kind of error you are seeing. If there is something specific, please reply here and I can give something more tailored. In general, when debugging I usually start with the Databricks Assistant, as it is built into the notebooks and has been trained to understand Databricks syntax and features. It can be run with `/fix` or by clicking the purple star icon.

If the error is in your code syntax, such as Python or SQL, a Google search can usually provide quick insights, and recent Python versions also produce verbose error messages that point you in the right direction. If the error is in the underlying Spark code, reading through the stack trace to identify whether it is an access issue, a data issue, etc. will help you ask the right question of the Assistant, Copilot, or Google to fix it.

Depending on your issue, running similar cells around the one that failed, such as changing the syntax or checking that access to the data works, is always a good sanity check. Again, an example exception would help me give better advice.
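As a general pattern, you can wrap the failing cell's logic in a try/except that prints the full trace and gives a rough first pointer toward the likely cause. A minimal sketch; the keyword-to-hint mapping below is illustrative (a couple of common Spark/Databricks error markers), not an official or exhaustive error list:

```python
import traceback

# Illustrative keywords only, not an official Databricks error-class list.
ERROR_HINTS = {
    "PERMISSION_DENIED": "Access issue: check grants on the catalog/schema/table.",
    "TABLE_OR_VIEW_NOT_FOUND": "Data issue: verify the table name and current catalog.",
    "AnalysisException": "Query issue: check column names and SQL syntax.",
    "Py4JJavaError": "Spark runtime issue: look for the 'Caused by:' line in the trace.",
}

def classify_exception(exc: BaseException) -> str:
    """Return a rough debugging hint based on the exception's type and message."""
    text = f"{type(exc).__name__}: {exc}"
    for keyword, hint in ERROR_HINTS.items():
        if keyword in text:
            return hint
    return "Unrecognized error: paste the full trace into the Assistant or a search."

# Usage: wrap the failing cell's logic so the full trace is always printed.
try:
    raise RuntimeError("[TABLE_OR_VIEW_NOT_FOUND] The table `orders` cannot be found.")
except Exception as e:
    print(traceback.format_exc())   # full trace for the Assistant / a search
    print(classify_exception(e))    # rough first pointer
```

This keeps the complete trace visible (which is what the Assistant needs) while giving you a quick hint about where to look first.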

