Spark read CSV does not throw an exception if the file path is not available in Databricks 14.3

thilanka02
New Contributor II

We were using the method below, and it worked as expected on Databricks 13.3.

def read_file():
    try:
        # Read the CSV file at raw_path using the predefined kpi_schema.
        df_temp_dlr_kpi = spark.read.load(raw_path, format="csv", schema=kpi_schema)
        return df_temp_dlr_kpi
    except Exception as e:
        # If anything goes wrong (e.g. the file is missing), log the error
        # and stop the notebook run.
        error_message = str(e)
        print(error_message)
        dbutils.notebook.exit('Fail')

After upgrading to Databricks 14.3, the method now returns without catching the exception when the file is not available.

When we call the method, an unhandled exception is raised, as shown below.

[Screenshot 2024-04-19: unhandled exception when calling the method]
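For reference, the behaviour can be reproduced with a minimal snippet along these lines (the path below is just an illustrative non-existent location; kpi_schema is the same schema as above):

# The path below is illustrative and intentionally does not exist.
missing_path = "dbfs:/tmp/this_path_does_not_exist"

try:
    df = spark.read.load(missing_path, format="csv", schema=kpi_schema)
    # On Databricks 14.3 (Spark 3.5) this line is reached: load() returns
    # a DataFrame even though the path is missing.
    print("load() returned without raising")
except Exception as e:
    # On Databricks 13.3 (Spark 3.4) the missing path raised here instead.
    print("caught: " + str(e))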

Can anyone explain why we are seeing this behaviour, and how we can catch the exception?

Thanks in advance

1 ACCEPTED SOLUTION

daniel_sahal
Esteemed Contributor

@thilanka02 
It looks like this is a bug in Spark 3.5; there is not much you can do about it right now.

 https://issues.apache.org/jira/browse/SPARK-47708
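
In the meantime, if you need the notebook to fail fast, one possible workaround (just a sketch, assuming raw_path and kpi_schema are defined as in your snippet) is to check the path yourself before reading, since dbutils.fs.ls raises an exception for a missing path:

def read_file():
    try:
        # dbutils.fs.ls raises an exception when raw_path does not exist,
        # so a missing file is caught here even if spark.read.load
        # no longer raises eagerly on Spark 3.5.
        dbutils.fs.ls(raw_path)
        return spark.read.load(raw_path, format="csv", schema=kpi_schema)
    except Exception as e:
        print(str(e))
        dbutils.notebook.exit('Fail')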


3 REPLIES


thilanka02
New Contributor II

Thank you @daniel_sahal for the reply

databricks100
New Contributor II

Hi, has this been resolved? I am still seeing this issue on Runtime 14.3 LTS.

Thanks in advance.
