Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

After moving mounted s3 bucket under unity catalog control, python file paths no longer work

chrisf_sts
New Contributor II

I had been using a mounted external S3 bucket with JSON files until a few days ago, when my company moved all file mounts under Unity Catalog control. Suddenly I can no longer run a command like:

with open("/mnt/my_files/my_json.json", "r") as f_read:
    file_stuff = json.load(f_read)

When I do I get the error: 
FileNotFoundError: [Errno 2] No such file or directory: '/mnt/my_files/my_json.json'

If I run a dbutils command, like dbutils.fs.head("/mnt/my_files/my_json.json"), the same path works correctly. Up until we brought the file mount under Unity Catalog, the with open command worked correctly.

I need to be able to open large JSON files in my Databricks notebook and parse them, because the log files I'm reading contain multiple large JSON objects that are not separated by proper JSON syntax; they just follow one after another in the file. I attempted to use dbutils.fs.head for the parsing, but the files are too big and it truncates them, so they're incomplete. As far as I can tell, I can't stop that from happening.
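For files like these, where whole JSON objects are simply concatenated with no delimiter between them, Python's json.JSONDecoder.raw_decode can walk the buffer one object at a time. A minimal sketch (the sample data and function name are illustrative, not from the thread):

```python
import json

def iter_concatenated_json(text):
    """Yield each JSON object from a string of back-to-back objects."""
    decoder = json.JSONDecoder()
    idx = 0
    n = len(text)
    while idx < n:
        # Skip any whitespace between objects.
        while idx < n and text[idx].isspace():
            idx += 1
        if idx >= n:
            break
        # raw_decode returns the parsed object and the index where it ended.
        obj, end = decoder.raw_decode(text, idx)
        yield obj
        idx = end

# Example with two back-to-back objects, as described in the post:
blob = '{"a": 1}{"b": [2, 3]}'
objects = list(iter_concatenated_json(blob))
```

This reads the whole file into memory first, which is fine as long as the file fits in the driver's RAM; for truly huge files the same raw_decode loop can be run over a growing buffer filled in chunks.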

1 ACCEPTED SOLUTION

Ayushi_Suthar
Honored Contributor

Hi @chrisf_sts, thanks for bringing up your concerns; always happy to help 😁

May I know which cluster access mode you are using to run the notebook commands?

Can you please try running the command below on a Single user access mode cluster?

with open("/mnt/my_files/my_json.json", "r") as f_read:
    file_stuff = json.load(f_read)

You can refer to this document for more details about the cluster access mode: https://docs.databricks.com/en/compute/configure.html#access-modes

Also, the reason behind the error when trying to access the external DBFS mount file using with open is that you are using a shared access mode cluster. This is a known limitation of shared clusters, where the /dbfs path is not accessible. You can try a single-user cluster instead, which supports Unity Catalog and allows /dbfs access.

Please refer:
https://docs.databricks.com/clusters/configure.html#shared-access-mode-limitations
https://docs.databricks.com/en/dbfs/unity-catalog.html#how-does-dbfs-work-in-shared-access-mode
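On a single-user cluster the FUSE mount works with plain Python file I/O, so large files can be read in full, without the truncation that dbutils.fs.head applies. A minimal sketch of the corrected read (the path is the example one from the question; note json.load reads from a file object, while json.loads takes a string):

```python
import json

def load_json_file(path):
    """Read one JSON document from a POSIX path (e.g. a /mnt FUSE mount)."""
    # json.load takes a file object; json.loads would need the text itself.
    with open(path, "r") as f_read:
        return json.load(f_read)

# On a single-user cluster this would be called as:
# file_stuff = load_json_file("/mnt/my_files/my_json.json")
```

On a shared access mode cluster this same open() call is what raises FileNotFoundError, because the FUSE path is not exposed there.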

There is also a preview feature, 'Improved Shared Clusters', that addresses some of the limitations of shared clusters.

Please let me know if this helps, and leave a like if it does; follow-ups are appreciated.
Kudos
Ayushi


2 REPLIES


Kaniz_Fatma
Community Manager

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation and let us know if you need any further assistance! 
 
