Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.
COPY INTO from Volume failure (rabbit hole)

alwaysmoredata
New Contributor II
hey guys, I am stuck on a loading task and I simply can't spot what is wrong.
 
The following query fails:
 
COPY INTO `test`.`test_databricks_tokenb3337f88ee667396b15f4e5b2dd5dbb0`.`pipeline_state`
FROM '/Volumes/test/test_databricks_tokenb3337f88ee667396b15f4e5b2dd5dbb0/_temp_load_volume/file_1737154045813408000'
FILEFORMAT = PARQUET;
 
with the following error:
 
From Python:

DatabaseTerminalException(ServerOperationError("The source directory did not contain any parsable files of type PARQUET. Please check the contents of '/Volumes/test/test_databricks_tokenb3337f88ee667396b15f4e5b2dd5dbb0/_temp_load_volume/file_1737154045813408000'."))

From a Databricks SQL query:

[COPY_INTO_SOURCE_SCHEMA_INFERENCE_FAILED] The source directory did not contain any parsable files of type PARQUET. Please check the contents of '/Volumes/test/test_databricks_tokenb3337f88ee667396b15f4e5b2dd5dbb0/_temp_load_volume/file_1737154045813408000'.
The error can be silenced by setting 'spark.databricks.delta.copyInto.emptySourceCheck.enabled' to 'false'.
and I have downloaded and read the parquet file with pandas - the file is perfectly fine...
 
What is wrong? I am stuck... 
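For anyone hitting the same wall: besides reading the file with pandas, a stdlib-only sanity check is possible on a downloaded copy, because every valid Parquet file starts with the 4-byte magic marker `PAR1` and also ends with it (after the footer). This is just a sketch; the path in the usage comment is a placeholder for wherever the file was downloaded.

```python
import os

def looks_like_parquet(path: str) -> bool:
    """Return True if the file is non-empty and carries the Parquet magic markers."""
    if os.path.getsize(path) < 12:  # the smallest valid Parquet file is 12 bytes
        return False
    with open(path, "rb") as f:
        header = f.read(4)
        f.seek(-4, os.SEEK_END)  # the file must also END with the magic bytes
        footer = f.read(4)
    return header == b"PAR1" and footer == b"PAR1"

# Usage on a locally downloaded copy (placeholder path):
# looks_like_parquet("file_1737154045813408000")
```

If this returns True but COPY INTO still fails, the problem is more likely on the Volume side (file size, naming, or permissions) than in the file content itself.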
 
 
 
 
1 REPLY 1

NandiniN
Databricks Employee

I see you are reading just one file; ensure that there are no zero-byte files in the directory, since zero-byte files can cause schema inference to fail.
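One way to check that is a small recursive scan: Unity Catalog Volume paths under /Volumes are visible to driver-side Python, so a sketch like the one below (the helper name is illustrative, not a Databricks API; `dbutils.fs.ls` would work just as well) can be pointed at the `_temp_load_volume` directory from the question.

```python
import os

def find_zero_byte_files(directory: str) -> list[str]:
    """Return the paths of all zero-byte files under `directory`, recursively."""
    empties = []
    for root, _dirs, files in os.walk(directory):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getsize(path) == 0:
                empties.append(path)
    return empties

# Example (directory from the question; any hit would explain the COPY INTO error):
# find_zero_byte_files(
#     "/Volumes/test/test_databricks_tokenb3337f88ee667396b15f4e5b2dd5dbb0/_temp_load_volume"
# )
```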

Also double-check that the directory contains valid Parquet files using a tool such as parquet-tools. Sometimes, even if a file appears fine when read with pandas, there can be issues with how the file is stored or named in the directory.

 
