
File not found error.

Data_Analytics1
Contributor III

FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/FileStore/config.share'

When I try to read the config.share file, it throws this error. I also tried the Spark path format, dbfs:/FileStore/config.share, but that didn't work either.

Cluster configuration: 11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12), Unity Catalog enabled

14 REPLIES

Kaniz_Fatma
Community Manager

Hi @Mahesh Chahare, a FileNotFoundError occurs when Python cannot find the specified file or directory.

In this case, the error is raised when reading the config.share file in Databricks.

It's possible that the file does not exist at the specified path or that the path is incorrect. You can verify the path to the file by checking the Databricks workspace UI or by running a command to list the files in the directory.

Check the file path: Verify that the file exists at the specified path by navigating to the Databricks workspace UI and checking the file's location. You can also list the files in the directory with dbutils.fs, for example:

dbutils.fs.ls("/FileStore/")

Check file access: Ensure that the user running the code can read the config.share file. You can confirm the file's entry with the following command:

dbutils.fs.ls("/FileStore/config.share")

This should display the file's details (path, name, size, and modification time); note that dbutils.fs.ls does not report permissions.
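To narrow down where the lookup fails, it helps to check the file through both access paths: dbutils.fs resolves DBFS paths, while plain Python file I/O goes through the /dbfs FUSE mount. A minimal sketch for a Databricks notebook (where dbutils is predefined), using the paths from the error above:

import os

dbfs_dir = "dbfs:/FileStore/"                # path form that dbutils/Spark understand
local_path = "/dbfs/FileStore/config.share"  # same file via the local FUSE mount

# Should list config.share if the file really is in DBFS
print([f.name for f in dbutils.fs.ls(dbfs_dir)])

# If this prints False while the listing above shows the file, local
# (non-Spark) file access is what is failing, not the file itself
print(os.path.exists(local_path))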

Hi @Kaniz Fatma, the file is there; it even shows full details with dbutils.fs.ls("/FileStore/").

but when I give the same path to delta sharing library, it gives me this error.

import delta_sharing

# /dbfs/... is the local FUSE path for DBFS files
profile_file = "/dbfs/FileStore/config.share"
profile = delta_sharing.SharingClient(profile_file)

When executing the above code snippet, a FileNotFoundError pops up.

Hi @Mahesh Chahare,

  • Try reading a different file: Read a separate file from the same directory to see if you can access it. If you can access other files but not config.share, it may indicate a problem with the file itself.

  • Check the library version: Make sure that you have the latest version of the delta_sharing library installed. You can upgrade it by running:

pip install --upgrade delta-sharing

  • Check if the delta_sharing library is installed: Ensure that the delta_sharing library is installed on your cluster. You can check this by running the following (see also the sketch after this list):

pip freeze | grep delta-sharing

  • Try specifying the full path: Instead of a path like /dbfs/FileStore/config.share, try spelling out the full path to the file, like dbfs:/mnt/data/config.share.
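A minimal sketch of that installation/version check from inside a notebook, using importlib.metadata from the standard library (note the distribution name on PyPI is delta-sharing, while the import name is delta_sharing):

from importlib.metadata import PackageNotFoundError, version

try:
    # The PyPI distribution is named "delta-sharing"
    print("delta-sharing version:", version("delta-sharing"))
except PackageNotFoundError:
    print("delta-sharing is not installed on this cluster")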

If none of these solutions work, you may want to contact Databricks support to help you troubleshoot the issue.

Hi @Kaniz Fatma, I can read the same file with the code snippet below.

import delta_sharing

# Loading the shared table as a Spark DataFrame works
table_url_spark = "dbfs:/FileStore/shares/config.share" + "#share.schema.table"
data_spark_cdf = delta_sharing.load_as_spark(table_url_spark)
data_spark_cdf.count()

But with the same path I am unable to read the file in the code snippet below.

import delta_sharing

profile_file = "dbfs:/FileStore/share/config.share"
profile = delta_sharing.SharingClient(profile_file)
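One plausible explanation for the difference (a sketch of the behavior, not confirmed against the library source): load_as_spark hands the profile URL to the Spark connector, which knows how to resolve dbfs:/ URIs, while SharingClient opens the profile file outside Spark with ordinary file I/O, which does not understand the dbfs:/ scheme. On clusters where the /dbfs FUSE mount is available, pointing the client at the mounted path may work:

import delta_sharing

# Spark path: resolved by the Spark Delta Sharing connector
df = delta_sharing.load_as_spark(
    "dbfs:/FileStore/shares/config.share#share.schema.table"
)

# Local path: SharingClient reads the profile with plain file I/O,
# so it needs the /dbfs FUSE mount rather than a dbfs:/ URI
client = delta_sharing.SharingClient("/dbfs/FileStore/shares/config.share")
print(client.list_shares())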

Hi @Mahesh Chahare, there might be some confusion with the file path in the second code snippet.

In the first code snippet, you use the dbfs:/FileStore/shares path prefix and append the file path config.share with a table name to access the Delta Sharing table using the delta_sharing.load_as_spark() method. This method loads the table as a Spark DataFrame.

In the second code snippet, you use the dbfs:/FileStore/share path prefix and append the file path config.share to create a Delta Sharing client using the delta_sharing.SharingClient() method.

It looks like the path prefixes are different (shares vs share), which might be causing the issue. Please make sure that the path prefixes match and try again.
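A quick way to rule out a prefix mismatch is to list both candidate directories (a sketch for a notebook, where dbutils is predefined):

# Print the contents of each candidate directory, or the error it raises
for d in ("dbfs:/FileStore/share/", "dbfs:/FileStore/shares/"):
    try:
        print(d, [f.name for f in dbutils.fs.ls(d)])
    except Exception as e:
        print(d, "->", e)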

Hi @Kaniz Fatma, the extra 's' in 'shares' was accidentally dropped while copy-pasting the code into the comment. But the problem still exists.

Hi @Mahesh Chahare, you may have to contact Databricks support to help you troubleshoot the issue.

pfig
New Contributor II

Something is going on with DBFS files/paths. I have experienced similar situations, even with simple files and paths. No matter which path format you use, it doesn't work.

JanaP
New Contributor II

I receive the "File not found" error when reading a DBFS file after the cluster was changed. With the same file and its DBFS location, there is no error when I change the cluster back.

raghu2
New Contributor III

I have the same issue on a UC-enabled cluster, DBR version 14.3 LTS. When I switch to a 10.x cluster version, there is no error.

raghu2
New Contributor III

I was able to resolve the issue by adding the pyOpenSSL library to the cluster.
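For anyone hitting the same thing: besides adding pyOpenSSL as a cluster library through the UI, a notebook-scoped install is a quick way to try this:

%pip install pyopenssl

followed by dbutils.library.restartPython() in the next cell (available on recent DBR versions) so the newly installed library is picked up.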

qigang
New Contributor II

Have you solved it? I have the same problem.

dkushari
New Contributor III

A few things to consider here: cluster type and DBR version. Can you please check whether the cluster is shared or assigned? This link might be helpful: https://docs.databricks.com/en/dbfs/unity-catalog.html

jacovangelder
Contributor III

On Unity Catalog Shared Access Mode clusters you need to use a UC Volume to read (config) files using vanilla Python (for example with open(), which many libraries use under the hood). You can no longer read files from DBFS this way. This is all part of the new security model.

So you would get:

profile_file = "/Volumes/Volume_123/share/config.share"
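Building on that, a minimal end-to-end sketch; the Volume path below is hypothetical (catalog, schema, and volume names are placeholders for your own):

import delta_sharing

# Hypothetical UC Volume path: /Volumes/<catalog>/<schema>/<volume>/<file>
profile_file = "/Volumes/main/default/configs/config.share"

# Volume paths are readable with plain file I/O on shared access mode
# clusters, so SharingClient can open the profile directly
client = delta_sharing.SharingClient(profile_file)
print(client.list_shares())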
Good luck!
