Data Engineering
Unable to read file from DBFS location in Databricks.

Data_Engineer3
Contributor II

When I tried to read a file from DBFS, it threw this error: Caused by: FileReadException: Error while reading file dbfs:/.......................parquet is not a Parquet file. Expected magic number at tail [80, 65, 82, 49] but found [105, 108, 101, 115].

But I checked the file location and type, and everything shows Parquet format with data present inside. I can't figure out what is going on.
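For context on what the error means: the expected tail bytes [80, 65, 82, 49] are the ASCII string "PAR1", the four-byte magic number a valid Parquet file carries at both its start and its end. The bytes actually found, [105, 108, 101, 115], decode to "iles", which suggests the file's content is plain text (something ending in "...files"), not Parquet, regardless of its .parquet extension. A minimal stdlib sketch (file names here are illustrative, not the poster's actual path) that checks the magic bytes of a local copy:

```python
import os

PARQUET_MAGIC = b"PAR1"  # ASCII bytes [80, 65, 82, 49]

def looks_like_parquet(path):
    """Return True if the file starts AND ends with the Parquet magic bytes."""
    if os.path.getsize(path) < 8:  # too small to hold both head and tail magic
        return False
    with open(path, "rb") as f:
        head = f.read(4)
        f.seek(-4, os.SEEK_END)    # jump to the last 4 bytes of the file
        tail = f.read(4)
    return head == PARQUET_MAGIC and tail == PARQUET_MAGIC
```

On Databricks, DBFS paths can usually be reached from Python through the /dbfs/ FUSE mount (e.g. open a dbfs:/tmp/x.parquet file as /dbfs/tmp/x.parquet), so a check like this can run directly in a notebook cell.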

1 ACCEPTED SOLUTION


Are you able to list all the files in this folder location using dbutils? You might need to read the Parquet file using the examples at https://docs.databricks.com/data/data-sources/read-parquet.html
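The two suggestions in the accepted answer can be sketched as below. This is meant for a Databricks notebook, where spark and dbutils are predefined globals; the path is a hypothetical placeholder, not the poster's actual location.

```python
# Sketch of the accepted answer's suggestions, for a Databricks notebook
# (assumes the notebook-provided `spark` and `dbutils` globals exist).

path = "dbfs:/tmp/example/"  # hypothetical folder, not from the thread

def inspect_and_read(path):
    """List the folder with dbutils, then try to read it as Parquet."""
    # Step 1: confirm the files exist and look sane. A 0-byte file, or one
    # far smaller than its siblings, is a sign the write was corrupted.
    for f in dbutils.fs.ls(path):     # noqa: F821 (Databricks builtin)
        print(f.path, f.size)
    # Step 2: read with the Parquet reader explicitly, rather than via a
    # table reference, to rule out a metadata/format mismatch.
    df = spark.read.parquet(path)     # noqa: F821 (Databricks builtin)
    df.printSchema()
    return df
```

If dbutils.fs.ls itself fails on the path (as the poster reports later in the thread), the problem is with the stored files, not with the reading code.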


4 REPLIES

Kaniz
Community Manager

Hi @KARTHICK N​, what is the one line of code you're using to read the file, including the exact path?

  • Can you confirm whether your file is a CSV or a Parquet file?
  • Are you trying to read it in Python or Scala?

Hi @Kaniz Fatma (Databricks)​ ,

I used a spark.sql command to read the table data; the underlying data is stored in Parquet format.

I am trying to read data from a dbfs location, and it is a Parquet file. I have cross-checked by running an ls command, and the file is present.

Are you able to list all the files in this folder location using dbutils? You might need to read the Parquet file using the examples at https://docs.databricks.com/data/data-sources/read-parquet.html

No, I am not able to list the files with the dbutils command either; it throws the same error.
