02-09-2024 08:15 PM
I had been using a mounted external S3 bucket with JSON files until a few days ago, when my company moved all file mounts under the control of Unity Catalog. Suddenly I can no longer run a command like:
with open("/mnt/my_files/my_json.json", "r") as f_read:
    file_stuff = json.load(f_read)
When I do I get the error:
FileNotFoundError: [Errno 2] No such file or directory: '/mnt/my_files/my_json.json'
If I run a dbutils command, like dbutils.fs.head("/mnt/my_files/my_json.json"), the same path works correctly. Before we brought the file mount under Unity Catalog, the with open command also worked correctly.
I need to be able to open large JSON files in my Databricks notebook and parse them, because the log files I'm reading contain multiple large JSON objects that are not separated by proper JSON syntax; they simply follow one after the other in the file. I attempted to use dbutils.fs.head for the parsing, but the files are too big and it truncates them, so they come back incomplete. As far as I can tell, I can't stop that from happening.
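For the parsing side of the problem (several JSON objects written back to back with no separators), json.JSONDecoder.raw_decode from the Python standard library can walk the text one object at a time. A minimal sketch, assuming the path is readable with plain Python open (see the accepted answer below for the access-mode issue) and the whole file fits in driver memory:

```python
import json

def parse_concatenated_json(path):
    # Parse a file containing multiple JSON objects written back to back.
    decoder = json.JSONDecoder()
    objects = []
    with open(path, "r") as f_read:
        text = f_read.read()
    idx = 0
    while idx < len(text):
        # Skip whitespace between objects.
        while idx < len(text) and text[idx].isspace():
            idx += 1
        if idx >= len(text):
            break
        # raw_decode returns the parsed object and the index where it ended.
        obj, end = decoder.raw_decode(text, idx)
        objects.append(obj)
        idx = end
    return objects

# Example, using the path from the question:
# records = parse_concatenated_json("/mnt/my_files/my_json.json")
```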
Accepted Solutions
02-10-2024 04:49 AM
Hi @chrisf_sts, thanks for bringing up your concerns, always happy to help 😁
May I know which cluster access mode you are using to run the notebook commands?
Can you please try running the command below on a cluster with Single user access mode?
"with open("/mnt/my_files/my_json.json", "r") as f_read:
file_stuff = json.loads(f_read)"
You can refer to this document for more details about cluster access modes: https://docs.databricks.com/en/compute/configure.html#access-modes
The likely reason for the error when accessing the external DBFS mount with "with open" is that you are using a shared access mode cluster. This is a known limitation of shared clusters: the /dbfs path is not accessible from local Python. You can use a single-user cluster instead, which supports Unity Catalog and allows /dbfs access; a possible workaround for shared clusters is sketched after the links below.
Please refer:
https://docs.databricks.com/clusters/configure.html#shared-access-mode-limitations
https://docs.databricks.com/en/dbfs/unity-catalog.html#how-does-dbfs-work-in-shared-access-mode
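If switching to a single-user cluster is not an option, one possible workaround is to copy the file to the driver's local disk and open it from there. This is only a sketch, under the assumption that dbutils.fs can still read the mount on your cluster (as dbutils.fs.head does in the question):

```python
import json

# dbutils is available automatically inside a Databricks notebook.
src = "dbfs:/mnt/my_files/my_json.json"   # mounted file
dst = "file:/tmp/my_json.json"            # driver-local copy

# Copy the file from the mount to the driver's local filesystem.
dbutils.fs.cp(src, dst)

# Now plain Python file access works against the local copy.
with open("/tmp/my_json.json", "r") as f_read:
    file_stuff = json.load(f_read)
```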
We also have a preview feature, 'Improved Shared Clusters', that addresses some of the limitations of shared clusters.
Please let me know if this helps, and leave a like if it does; follow-ups are appreciated.
Kudos
Ayushi