Hello Team,
I've run into an issue while trying to read a CSV file into a pandas DataFrame after uploading it to DBFS in Databricks Community Edition. Below are the code snippet I used and the error it produced:
import pandas as pd
df1 = pd.read_csv("/dbfs/FileStore/shared_uploads/shiv/Dea.csv")
Error Encountered:
FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/FileStore/shared_uploads/shiv/Dea.csv'
However, when I check for the file with the dbutils.fs.ls command, it is clearly present:
dbutils.fs.ls("/FileStore/shared_uploads/shiv/Dea.csv")
Output:
[FileInfo(path='dbfs:/FileStore/shared_uploads/shiv/Dea.csv', name='Dea.csv', size=18679559, modificationTime=1711631849000)]