Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Error handling - SQL states

noorbasha534
Contributor

Dear all,

A few questions, please:

1. Has anyone successfully used the approach below for error handling in PySpark notebooks (for example, ones that work with DataFrames) as well as in SQL-based notebooks?

 

from pyspark.errors import PySparkException

try:
  # Query a table that does not exist to trigger an AnalysisException
  # (a subclass of PySparkException).
  spark.sql("SELECT * FROM does_not_exist").show()
except PySparkException as ex:
  # The error class, message parameters and SQLSTATE are available as
  # structured fields on the exception.
  print("Error Class       : " + ex.getErrorClass())
  print("Message parameters: " + str(ex.getMessageParameters()))
  print("SQLSTATE          : " + ex.getSqlState())
  print(ex)

 

2. With this approach, is it advisable to log errors into tables? I am thinking of an errors table with four columns to capture the date, error class, message parameters, and SQLSTATE (a rough sketch of what I have in mind is at the end of this post).

3. Currently, we log all errors as ".txt" files in an ADLS storage account, with the idea of producing an operational dashboard on top of the errors. I think table-based error logging would be simpler to report on than periodically profiling the ADLS storage account/containers/folders.

4. Also, I noticed that when we capture and log errors as ".txt" files, the error message is at times very detailed, spanning hundreds of lines; not sure if it is the same on your end.

Appreciate a fruitful discussion on this.
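
For illustration, here is a rough sketch of what I have in mind for point 2; the table name ops.error_log and the column names are just placeholders, not something we have built yet:

from datetime import datetime, timezone

from pyspark.errors import PySparkException

try:
  spark.sql("SELECT * FROM does_not_exist").show()
except PySparkException as ex:
  # Log only the structured fields; the full message text can run to
  # hundreds of lines, so we would not store it as-is.
  error_row = [(
    datetime.now(timezone.utc),
    ex.getErrorClass(),
    str(ex.getMessageParameters()),
    ex.getSqlState(),
  )]
  schema = "error_ts timestamp, error_class string, message_parameters string, sqlstate string"
  (spark.createDataFrame(error_row, schema)
    .write.mode("append")
    .saveAsTable("ops.error_log"))  # placeholder table name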

1 REPLY

Alberto_Umana
Databricks Employee

Hi @noorbasha534,

 

  • The approach you mentioned for error handling in PySpark using PySparkException is a valid method. It allows you to catch specific exceptions related to PySpark operations and handle them accordingly.

  • Logging errors into tables is advisable, especially if you plan to create an operational dashboard to monitor and analyze errors. Having an errors table with columns for the date, error class, message parameters, and SQLSTATE can simplify querying and reporting on errors.

  • Transitioning from logging errors as ".txt" files in an ADLS storage account to logging them into tables can indeed simplify reporting. Table-based error logging allows for straightforward querying and analysis using SQL, which can be more efficient than periodically profiling storage accounts and containers; a rough query sketch follows below.
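
As a rough illustration (assuming an errors table along the lines you describe, here called ops.error_log with error_ts, error_class, message_parameters and sqlstate columns), a dashboard could be driven by a query as simple as:

# Hypothetical table and column names - adjust to your actual schema.
daily_errors = spark.sql("""
  SELECT date_trunc('DAY', error_ts) AS error_date,
         error_class,
         sqlstate,
         count(*) AS error_count
  FROM ops.error_log
  GROUP BY date_trunc('DAY', error_ts), error_class, sqlstate
  ORDER BY error_date DESC, error_count DESC
""")
daily_errors.show()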

 
