
File not found error.

Data_Analytics1
Contributor III

FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/FileStore/config.share'

When I try to read the config.share file, it throws this error. I also tried the Spark path format, dbfs:/FileStore/config.share, but that didn't work either.

Cluster configuration: 11.3 LTS (includes Apache Spark 3.3.0, Scala 2.12) Unity Catalog enabled


Kaniz
Community Manager

Hi @Mahesh Chahare, a FileNotFoundError is raised when Python cannot find the specified file or directory.

In this case, the error is raised while reading the config.share file in Databricks.

It's possible that the file does not exist in the specified path or that the path is incorrect. You can try verifying the path to the file by checking the Databricks workspace UI or by running a command to list the files in the directory.

Check the file path: Verify that the file exists in the specified path by navigating to the Databricks workspace UI and checking the file's location. You can also try listing the files in the directory using the Databricks File System (DBFS) API, for example:

dbutils.fs.ls("/FileStore/")

Check file access: Ensure that the user running the code can read the config.share file, and confirm that DBFS can resolve the exact path with:

dbutils.fs.ls("/FileStore/config.share")

This should display the file's details (path, name, size, and modification time); if the path is wrong, the call raises an error instead.
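Since the error comes from a Python library opening the local FUSE path /dbfs/..., it can also help to compare the DBFS view of the file with the driver's local-filesystem view. A minimal sketch, assuming a Databricks notebook where dbutils is available and using the path from the question:

import os

# DBFS URI view of the file (what Spark and dbutils.fs see)
display(dbutils.fs.ls("dbfs:/FileStore/config.share"))

# Local FUSE view of the same file (what plain Python file I/O, and hence
# libraries such as delta_sharing, sees when given /dbfs/...)
fuse_path = "/dbfs/FileStore/config.share"
print(fuse_path, "exists:", os.path.exists(fuse_path))

If the first call succeeds but the second prints False, the file itself is fine and the problem lies with local (FUSE) access to DBFS from that cluster.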

Hi @Kaniz Fatma, the file is there; it even shows full details with dbutils.fs.ls("/FileStore/"),

but when I give the same path to the delta_sharing library, I get this error.

import delta_sharing
profile_file = f"/dbfs/FileStore/config.share"
profile = delta_sharing.SharingClient(profile_file)

When I execute the above code snippet, a FileNotFoundError pops up.

Hi @Mahesh Chahare​,

  • Try reading a different file: Read a separate file from the same directory to see if you can access it. If you can access other files but not the config.share file, it may indicate a problem with the file itself.

  • Check the library version: Make sure that you have the latest version of the delta_sharing library installed. You can do this by running:
pip install --upgrade delta-sharing
  • Check that the delta_sharing library is installed: Ensure that the delta_sharing library is installed on your cluster. You can check this by running (a notebook-friendly version of this check is sketched after this list):

pip freeze | grep delta-sharing

  • Try specifying the path in a different format: Instead of the local FUSE path, like

/dbfs/FileStore/config.share

try the fully qualified DBFS URI for the file, for example

dbfs:/mnt/data/config.share

If none of these solutions work, you may want to contact Databricks support to help you troubleshoot the issue.
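In case it helps before contacting support, here is a minimal, notebook-friendly sketch of the installation/version check from the list above (assuming it is run on the cluster in question; importlib.metadata is part of the Python standard library):

import importlib.metadata

# Report the installed version of the Python connector, or say it is missing
try:
    print("delta-sharing version:", importlib.metadata.version("delta-sharing"))
except importlib.metadata.PackageNotFoundError:
    print("delta-sharing is not installed on this cluster")

If the package is missing or outdated, install or upgrade it with the pip command above (or attach it as a cluster library) and re-run the snippet.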

Hi @Kaniz Fatma, I can read the same file with the code snippet below.

table_url_spark = "dbfs:/FileStore/shares/config.share" + "#share.schema.table"
data_spark_cdf = delta_sharing.load_as_spark(table_url_spark)
data_spark_cdf.count()

But with the same path, I am unable to read the file in the code snippet below.

import delta_sharing
profile_file = f"dbfs:/FileStore/share/config.share"
profile = delta_sharing.SharingClient(profile_file)

Hi @Mahesh Chahare​, There might be some confusion with the file path in the second code snippet.

In the first code snippet, you use the dbfs:/FileStore/shares path prefix and append the file path config.share with a table name to access the Delta Sharing table using the delta_sharing.load_as_spark() method. This method loads the table as a Spark DataFrame.

In the second code snippet, you use the dbfs:/FileStore/share path prefix and append the file path config.share to create a Delta Sharing client using the delta_sharing.SharingClient() method.

It looks like the path prefixes are different (shares vs share), which might be causing the issue. Please make sure that the path prefixes match and try again.
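For reference, here is a minimal sketch of the two access patterns side by side with one consistent location. The share.schema.table suffix is a placeholder from this thread, and the assumption (based on the documented usage of the Python connector) is that load_as_spark() accepts the dbfs:/ URI, while SharingClient() reads the profile like an ordinary local file, so the FUSE form /dbfs/... is normally used there:

import delta_sharing

# Spark connector: dbfs:/ URI plus #share.schema.table
table_url = "dbfs:/FileStore/shares/config.share#share.schema.table"
df = delta_sharing.load_as_spark(table_url)

# Python client: reads the profile via the local filesystem, hence /dbfs/...
client = delta_sharing.SharingClient("/dbfs/FileStore/shares/config.share")
print(client.list_all_tables())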

Hi @Kaniz Fatma, the extra 's' was dropped from "shares" by mistake while copy-pasting the code into the comment. The problem still exists.

Hi @Mahesh Chahare​, you may have to contact Databricks support to help you troubleshoot the issue.

pfig
New Contributor II

Something is going on with DBFS files/paths. I have experienced similar situations, even with simple files and paths. No matter which path format you use, it doesn't work.

JanaP
New Contributor II

I receive the "File not found" error when reading a DBFS file after the cluster was changed. With the same file and the same DBFS location, there is no error when I switch the cluster back.

raghu2
New Contributor III

I have the same issue on a UC-enabled cluster, DBR version 14.3 LTS. When I switch to cluster version 10.x, there is no error.

raghu2
New Contributor III

I was able to resolve the issue by adding the pyOpenSSL library to the cluster.
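For anyone hitting the same thing, one way to try this fix is a notebook-scoped install before importing delta_sharing (attaching pyOpenSSL as a cluster library through the Libraries UI, as described above, is the more permanent option):

%pip install --upgrade pyOpenSSL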
