@Hritik_Moon
Try reading the directory as a Delta table. A Delta table's root folder looks like this:
path/delta_file_name/
- parquet data files
- _delta_log/ (transaction log)
Since you are using Spark, load it with spark.read.format("delta").load("path/delta_file_name").
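As a minimal sketch, the read could look like this in a notebook cell ("path/delta_file_name" is the placeholder path from your question; this assumes a Spark session with Delta Lake support, which is preconfigured on Databricks):

```python
from pyspark.sql import SparkSession

# On Databricks a session already exists as `spark`; elsewhere,
# this assumes Delta Lake extensions are on the classpath.
spark = SparkSession.builder.getOrCreate()

# Point at the table's ROOT directory -- the folder containing the
# parquet files and _delta_log/ -- not an individual parquet file.
df = spark.read.format("delta").load("path/delta_file_name")

df.printSchema()
df.show(5)
```

If you point `.load()` at a single parquet file inside the folder instead of the root, the read will fail because Delta needs the `_delta_log/` directory to resolve the table state.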
Delta internally stores the data as parquet files, and the _delta_log/ directory holds the transaction metadata. You don't need to touch these files directly unless you are experimenting. 🙂
For more info, please go through this tutorial: https://docs.databricks.com/aws/en/delta/tutorial.
Hope this solves your issue.