Spark read CSV does not throw Exception if the file path is not available in Databricks 14.3

thilanka02
New Contributor II

We were using the following method, which worked as expected in Databricks 13.3.

def read_file():
  try:
    df_temp_dlr_kpi = spark.read.load(raw_path, format="csv", schema=kpi_schema)
    return df_temp_dlr_kpi
  except Exception as e:
    error_message = str(e)
    print(error_message)
    dbutils.notebook.exit('Fail')

But after upgrading Databricks to 14.3, the method now returns without catching the exception when the file is not available.

When we call the method, an unhandled exception is raised, as shown below.

[Screenshot omitted: Screenshot 2024-04-19 at 13.29.19.png]

Can anyone explain why we are seeing this behaviour, and how we can catch the exception?

Thanks in advance

1 ACCEPTED SOLUTION

daniel_sahal
Esteemed Contributor

@thilanka02 
It looks like this is a bug in Spark 3.5 (SPARK-47708); there's not much you can do about it right now.

 https://issues.apache.org/jira/browse/SPARK-47708
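Until the upstream fix lands, one common workaround (an assumption on my part, not an official Databricks recommendation) is to verify that the path exists before reading, so a missing file fails fast inside the `try` block instead of surfacing later from a lazy read. A minimal sketch, with the listing and reading callables injected so it can run outside Databricks; in a notebook you would pass `dbutils.fs.ls` and a `spark.read.load` wrapper:

```python
def read_file_safely(raw_path, list_fn, read_fn):
    """Return the loaded DataFrame, or None if the path is missing/unreadable.

    list_fn: callable that raises if the path does not exist
             (e.g. dbutils.fs.ls on Databricks)
    read_fn: callable that loads the path
             (e.g. lambda p: spark.read.load(p, format="csv", schema=kpi_schema))
    """
    try:
        list_fn(raw_path)         # fails fast if the path does not exist
        return read_fn(raw_path)  # read only after the existence check
    except Exception as e:
        print(f"read failed: {e}")
        return None
```

The caller can then decide whether to `dbutils.notebook.exit('Fail')` when the result is `None`. Another option, if the bug is that the exception is only raised on evaluation, would be to force an action (e.g. `df.head(1)`) inside the `try` block.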


2 REPLIES

thilanka02
New Contributor II

Thank you @daniel_sahal for the reply.
