01-24-2022 08:45 PM
Hi, I am getting the following error:
com.databricks.sql.io.FileReadException: Error while reading file wasbs:REDACTED_LOCAL_PART@blobStorageName.blob.core.windows.net/cook/processYear=2021/processMonth=12/processDay=30/processHour=18/part-00003-tid-4178615623264760328.c000.avro.
Caused by: com.microsoft.azure.storage.StorageException: Blob hash mismatch (integrity check failed), Expected value is 8P7bo1mnLPoLxVw==, retrieved bu+CiCkLm/kc6QA==.
where processYear, processMonth, processDay and processHour are partition columns.
However, this is actually just a WARN and the code proceeds to execute (I am also able to read this file separately in a notebook), but eventually the job dies with:
WARN Lost task 9026.0 in stage 324.0 (TID 1525596, 10.139.64.16, executor 83): TaskKilled (Stage cancelled)
I am using the following databricks and spark configs:
RuntimeVersion: 5.5.x-scala2.11
MasterConfiguration:
NodeType: Standard_D32s_v3
NumberOfNodes: 1
WorkerConfiguration:
NodeType: Standard_D32s_v3
NumberOfNodes: 2
This same job is deployed in several other environments too with much more data volume, and it does not fail there. Any idea why it may fail here?
Thanks!
02-08-2022 05:04 PM
Hi @mayuri18kadam@gmail.com ,
This could be a limitation of spark-submit jobs. Please check the docs here https://docs.databricks.com/jobs.html#create-a-job and look for the following information:
Important
There are several limitations for spark-submit tasks....
01-25-2022 04:25 AM
Can you try to run your code, but without the file you get the exception on?
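One minimal way to try that is to enumerate the partition paths explicitly and leave out the suspect hour. This is only a sketch: the container/account names, the helper, and the `spark` session are illustrative, and the avro read options may differ on DBR 5.5 (which uses the `com.databricks.spark.avro` package rather than the built-in `avro` source).

```python
# Sketch: read everything except the partition that raised the integrity error.
# Container/account names and the helper below are illustrative, not from the job.

def partition_paths(base, year, month, day, hours, skip_hours=()):
    """Build explicit partition paths so a suspect hour can be left out."""
    return [
        f"{base}/processYear={year}/processMonth={month}"
        f"/processDay={day}/processHour={h}"
        for h in hours
        if h not in skip_hours
    ]

paths = partition_paths(
    "wasbs://container@account.blob.core.windows.net/cook",
    2021, 12, 30, hours=range(24), skip_hours=(18,),
)

# With a SparkSession in hand (assumed), the same read minus the suspect hour;
# basePath keeps the partition columns in the schema when paths are explicit:
# df = (spark.read.format("avro")
#       .option("basePath", "wasbs://container@account.blob.core.windows.net/cook")
#       .load(paths))
#
# Alternatively, let Spark skip unreadable files instead of failing the stage:
# spark.conf.set("spark.sql.files.ignoreCorruptFiles", "true")
```

If the job succeeds with the hour excluded, that narrows the problem down to those specific part files rather than the code or cluster config.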
01-26-2022 10:05 AM
Yes, I can read it from a notebook with DBR 6.4 when I specify this path:
wasbs:REDACTED_LOCAL_PART@blobStorageName.blob.core.windows.net/cook/processYear=2021/processMonth=12/processDay=30/processHour=18
but running the same read on DBR 6.4 via spark-submit fails again, each time complaining about different part files under different partitions.
Also, we have the exact same code with the exact same Spark configs deployed in several different regions, but this is the only one where we have an issue. Could this be data related, like some part-file size limitation for the given Spark version?
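One data-related check: the "Blob hash mismatch" in the trace means the base64-encoded MD5 stored with the blob (Content-MD5) did not match a hash of the bytes actually received. You can recompute that hash yourself on a downloaded copy of a failing part file; if it matches the stored value, the blob is intact and the mismatch happened in transit. A sketch using only the standard library; the file path is illustrative:

```python
# Sketch: recompute the Content-MD5 that Azure Blob Storage compares against.
# Azure stores it as the base64 encoding of the raw MD5 digest of the blob bytes.
import base64
import hashlib

def content_md5(path):
    """Return the base64-encoded MD5 digest of a local file, Content-MD5 style."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        # Read in 1 MiB chunks so large part files do not load fully into memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode("ascii")

# Compare against the expected value from the error message, e.g.:
# content_md5("part-00003-tid-...c000.avro")  # illustrative local copy
```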