How to handle a java.io exception in a Python notebook
10-04-2022 11:46 PM
I'm attempting to mount a volume using dbutils.fs.mount in a Python notebook. While adding exception handling around this statement, I found an exception that doesn't get caught by standard try/except handling. For example, if I pass in a container name that does not exist, I get the following exception, but it is not caught:
ExecutionError: An error occurred while calling o390.mount.
: java.io.FileNotFoundException: /: No such file or directory.
Is there an approach to catch a java.io exception within a Python notebook?
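For context, the mount call looks roughly like this (the source URL, mount point, and config values below are placeholders, not my real ones):

```python
# Placeholder values -- the real source, mount point, and configs differ.
# dbutils is available implicitly inside a Databricks notebook.
source = "wasbs://my-container@myaccount.blob.core.windows.net"  # container name mistyped here
mount_point = "/mnt/my-container"

try:
    dbutils.fs.mount(
        source=source,
        mount_point=mount_point,
        extra_configs={"fs.azure.account.key.myaccount.blob.core.windows.net": "<key>"},
    )
except Exception as e:
    print(f"Mount failed: {e}")
# The java.io.FileNotFoundException shown above escapes this handler anyway.
```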
10-06-2022 01:22 AM
Hi @Stuart Parker, could you please check on the Data/DBFS page whether the file is there (or list it via dbutils.fs.ls; see https://docs.databricks.com/dev-tools/databricks-utils.html)?
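For example, listing the mount root (the path below is just illustrative):

```python
# Show what is currently present under /mnt; adjust the path as needed.
display(dbutils.fs.ls("/mnt"))

# Or inspect the active mounts directly.
display(dbutils.fs.mounts())
```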
10-13-2022 06:36 PM
Hi @Debayan Mukherjee
The issue isn't a missing file. In the notebook that mounted the container, the container name was mistyped. Despite the mount command being wrapped in a try/except block, the error was not caught; it bubbled up and threw an exception that stopped the notebook.
My question is how to handle future cases like this, where standard Pythonic exception handling doesn't work.
10-13-2022 08:29 AM
If `except Exception as error:` is not working, you can first check whether the container name exists as a resource.
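A sketch of that pre-check using the azure-storage-blob package (assuming an Azure storage account; the account URL, key, and container name below are placeholders):

```python
from azure.storage.blob import BlobServiceClient

# Placeholder account URL and credential -- substitute your own.
service = BlobServiceClient(
    account_url="https://myaccount.blob.core.windows.net",
    credential="<account-key>",
)

container_name = "my-container"

# ContainerClient.exists() returns False rather than raising when the
# container is absent, so a typo is caught before the mount is attempted.
if service.get_container_client(container_name).exists():
    print("Container found; safe to proceed with dbutils.fs.mount(...)")
else:
    print(f"Container {container_name!r} does not exist; skipping mount.")
```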
10-13-2022 06:38 PM
Thanks, you're correct. However, I'm trying to determine whether it's possible to catch the Java error that is bubbling up within the PySpark environment.
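One option worth trying (a sketch, not verified against this exact error): py4j surfaces JVM exceptions to Python as py4j.protocol.Py4JJavaError, which does inherit from Exception, so it can be caught explicitly and the underlying Java class inspected. The mount arguments below are placeholders.

```python
from py4j.protocol import Py4JJavaError

try:
    dbutils.fs.mount(
        source="wasbs://my-container@myaccount.blob.core.windows.net",  # placeholder
        mount_point="/mnt/my-container",  # placeholder
    )
except Py4JJavaError as e:
    # e.java_exception is a proxy for the underlying JVM Throwable; its class
    # name tells us whether this is the java.io.FileNotFoundException case.
    java_class = e.java_exception.getClass().getName()
    if java_class == "java.io.FileNotFoundException":
        print(f"Mount target not found: {e.java_exception.getMessage()}")
    else:
        raise
except Exception as e:
    # Databricks also wraps some dbutils failures in its own ExecutionError;
    # this broad handler catches whatever the Py4JJavaError clause misses.
    print(f"Mount failed: {e}")
```

Note the ordering: Py4JJavaError is a subclass of Exception, so the more specific handler has to come first or it will never run.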
11-12-2022 09:51 PM
Hi @Stuart Parker
Hope all is well! Just wanted to check in to see whether you were able to resolve your issue, and if so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.
We'd love to hear from you.
Thanks!