Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

After moving mounted S3 bucket under Unity Catalog control, Python file paths no longer work

chrisf_sts
New Contributor II

I had been using a mounted external S3 bucket with JSON files until a few days ago, when my company moved all file mounts under Unity Catalog control. Suddenly I can no longer run a command like:

with open("/mnt/my_files/my_json.json", "r") as f_read:
    file_stuff = json.load(f_read)

When I do, I get this error:
FileNotFoundError: [Errno 2] No such file or directory: '/mnt/my_files/my_json.json'

If I run a dbutils command, like dbutils.fs.head("/mnt/my_files/my_json.json"), the same path works correctly. Until we brought the file mount under Unity Catalog, the with open command worked correctly too.

I need to be able to open large JSON files in my Databricks notebook and parse them, because the log files I'm reading contain multiple large JSON objects that are not separated by proper JSON syntax; they just sit one after another in the file. I tried using dbutils.fs.head for the parsing, but the files are too big, so it truncates them and they come back incomplete. As far as I can tell, I can't stop that from happening.
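
For context, what I'm ultimately trying to run looks roughly like the sketch below, which uses json.JSONDecoder.raw_decode to pull the concatenated objects out one at a time (the helper name is just illustrative):

import json

def parse_concatenated_json(path):
    # Yield each top-level JSON object from a file where objects
    # sit back-to-back with no separator between them.
    decoder = json.JSONDecoder()
    with open(path, "r") as f_read:
        text = f_read.read()
    idx = 0
    while idx < len(text):
        # raw_decode does not skip leading whitespace, so do it here.
        while idx < len(text) and text[idx].isspace():
            idx += 1
        if idx >= len(text):
            break
        # raw_decode returns the parsed object and the index just past it.
        obj, idx = decoder.raw_decode(text, idx)
        yield obj

for record in parse_concatenated_json("/mnt/my_files/my_json.json"):
    print(record)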

1 ACCEPTED SOLUTION

Ayushi_Suthar
Databricks Employee

Hi @chrisf_sts, thanks for bringing up your concern; always happy to help 😁

May I know which cluster access mode you are using to run the notebook commands?

Can you please try running the command below on a cluster in Single user access mode?

with open("/mnt/my_files/my_json.json", "r") as f_read:
    file_stuff = json.load(f_read)

You can refer to this document for more details about cluster access modes: https://docs.databricks.com/en/compute/configure.html#access-modes

Also, the reason behind the error when accessing the external DBFS mount file using with open is likely that you are on a shared access mode cluster. This is a known limitation of shared clusters: the /dbfs path is not accessible from them. You can try a single-user cluster instead, which supports Unity Catalog and can access /dbfs.

Please refer:
https://docs.databricks.com/clusters/configure.html#shared-access-mode-limitations
https://docs.databricks.com/en/dbfs/unity-catalog.html#how-does-dbfs-work-in-shared-access-mode
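
One more note on paths: dbutils.fs takes DBFS-style paths, while Python's built-in open goes through the driver's local filesystem, where DBFS is exposed under the /dbfs FUSE mount. A quick sketch of the two forms on a single-user cluster, reusing the path from this thread (a sketch, not verified against your workspace):

dbutils.fs.head("/mnt/my_files/my_json.json")  # DBFS path, works with dbutils

import json

# Local file APIs typically reach the same mount through the /dbfs prefix.
with open("/dbfs/mnt/my_files/my_json.json", "r") as f_read:
    file_stuff = json.load(f_read)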

We also have a preview feature, 'Improved Shared Clusters', that addresses some of these shared-cluster limitations.

Please let me know if this helps, and leave a like if it does; follow-ups are appreciated.
Kudos,
Ayushi

