Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

File not found error.

Data_Analytics1
Contributor III

FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/FileStore/config.share'

When I try to read the config.share file, it throws this error. I also tried the Spark path format, dbfs:/FileStore/config.share, but that didn't work either.

Cluster configuration: 11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12), Unity Catalog enabled

10 REPLIES

Hi @Kaniz Fatma, the file is there; it even shows full details using dbutils.fs.ls("/FileStore/").

But when I pass the same path to the delta-sharing library, it gives me this error.

import delta_sharing

# Local FUSE path to the Delta Sharing profile file
profile_file = "/dbfs/FileStore/config.share"
profile = delta_sharing.SharingClient(profile_file)

When executing the above code snippet, the FileNotFoundError pops up.
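For what it's worth, here is a minimal notebook sketch of that check, comparing the DBFS API view of the file with the local /dbfs FUSE view that plain-Python file access relies on (it assumes a Databricks notebook where dbutils and display are available):

import os

# DBFS API view: lists the file and its details
display(dbutils.fs.ls("dbfs:/FileStore/"))

# Local FUSE view: this is the path that plain-Python file access
# (and hence the Delta Sharing profile reader) needs to resolve
print(os.path.exists("/dbfs/FileStore/config.share"))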

Hi @Kaniz Fatma, I can read the same file with the code snippet below.

table_url_spark = "dbfs:/FileStore/shares/config.share" + "#share.schema.table"
data_spark_cdf = delta_sharing.load_as_spark(table_url_spark)
data_spark_cdf.count()

But with the same path, I am unable to read the file in the code snippet below.

import delta_sharing

profile_file = "dbfs:/FileStore/share/config.share"
profile = delta_sharing.SharingClient(profile_file)
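My (possibly incomplete) understanding of why the two behave differently, sketched below: load_as_spark hands the dbfs:/ URI to Spark and the Delta Sharing connector, while SharingClient reads the profile like a regular local file. The /dbfs FUSE path in the sketch assumes FUSE access is actually available on your cluster's access mode:

import delta_sharing

# Spark path: the dbfs:/ URI is resolved by Spark / the Delta Sharing connector
table_url = "dbfs:/FileStore/shares/config.share#share.schema.table"
df = delta_sharing.load_as_spark(table_url)

# Python client path: the profile is read like a local file, so use the
# /dbfs FUSE mount (assuming FUSE access is allowed on this cluster type)
client = delta_sharing.SharingClient("/dbfs/FileStore/shares/config.share")
print(client.list_shares())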

Hi @Kaniz Fatma, by mistake, while copy-pasting the code into the comment, the extra 's' was deleted from the word "shares". But the problem still exists.

pfig
New Contributor II

Something is going on with DBFS files/paths. I have experienced similar situations, even with simple files and paths. No matter which path format you use, it doesn't work.

JanaP
New Contributor II

I receive the "File not found" error when reading a DBFS file after the cluster was changed. With the same file and the same DBFS location, there is no error when I change the cluster back.

raghu2
New Contributor III

I have the same issue on a UC-enabled cluster, DBR version 14.3 LTS. When I switch to cluster version 10.x, there is no error.

raghu2
New Contributor III

I was able to resolve the issue by adding the pyOpenSSL library to the cluster.
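In case it helps anyone trying the same fix, a notebook-scoped install is one quick way to test it (attaching pyOpenSSL as a cluster library from the Libraries UI is the other option); the restart is only needed so the running notebook picks the package up:

# Cell 1: notebook-scoped install (or attach pyOpenSSL as a cluster library via the UI)
%pip install pyopenssl

# Cell 2: restart the Python process so the newly installed package is importable
dbutils.library.restartPython()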

qigang
New Contributor II

Have you solved this? I have the same problem.

dkushari
Databricks Employee

A few things to consider here: cluster type and DBR version. Can you please check whether the cluster is shared or assigned? This link might be helpful: https://docs.databricks.com/en/dbfs/unity-catalog.html

jacovangelder
Honored Contributor

On Unity Catalog Shared Access Mode clusters, you need to use a UC Volume to read (config) files with vanilla Python (for example via open(), which many libraries use under the hood). You can no longer read files from DBFS this way; this is all part of the new security model.

So you would get:

profile_file = "/Volumes/Volume_123/share/config.share"
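For completeness, a minimal sketch of wiring a Volume path into the Delta Sharing client; the catalog/schema/volume names below are placeholders, so substitute whatever your workspace actually uses:

import delta_sharing

# Placeholder Volume path: /Volumes/<catalog>/<schema>/<volume>/config.share
profile_file = "/Volumes/my_catalog/my_schema/my_volume/config.share"

# The profile is read as a regular local file, which works against a UC Volume
# path on Shared Access Mode clusters
client = delta_sharing.SharingClient(profile_file)
print(client.list_shares())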

Good luck!
