Hi @chrisf_sts, thanks for bringing up your concerns, always happy to help!
May I know which cluster access mode you are using to run the notebook commands?
Can you please try running the command below on a cluster with Single user access mode?

```python
import json

with open("/mnt/my_files/my_json.json", "r") as f_read:
    file_stuff = json.load(f_read)  # json.load reads from a file object; json.loads expects a string
```
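Note that `json.load` (not `json.loads`) is the right call here, since you are passing a file object rather than a string. Here is a quick self-contained sketch you can run anywhere; it uses a temporary file purely as a stand-in for the mount path:

```python
import json
import tempfile

# Write a small sample JSON file (stand-in for the DBFS mount path).
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f_write:
    json.dump({"name": "example", "count": 3}, f_write)
    sample_path = f_write.name

# json.load reads directly from the file object.
with open(sample_path, "r") as f_read:
    file_stuff = json.load(f_read)

print(file_stuff["count"])  # prints 3
```

If you passed `f_read` to `json.loads` instead, Python would raise a `TypeError`, because `json.loads` only accepts `str`, `bytes`, or `bytearray`.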
You can refer to this document for more details about the cluster access mode: https://docs.databricks.com/en/compute/configure.html#access-modes
Also, the reason for the error when you try to access the DBFS mount file using "with open" is that you are on a shared access mode cluster. This is a known limitation of shared clusters: the /dbfs path is not accessible there. You can use a single-user cluster instead, which supports Unity Catalog and can access /dbfs.
Please refer:
https://docs.databricks.com/clusters/configure.html#shared-access-mode-limitations
https://docs.databricks.com/en/dbfs/unity-catalog.html#how-does-dbfs-work-in-shared-access-mode
We also have a preview feature, 'Improved Shared Clusters', that addresses some of these shared-cluster limitations.
Please let me know if this helps, and leave a like if it does; follow-ups are appreciated.
Kudos
Ayushi