Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Getting "Job aborted due to stage failure" SparkException when trying to download full result

Tahseen0354
Valued Contributor

I have generated a result using SQL. But whenever I try to download the full result (1 million rows), it throws a SparkException. I can download the preview result but not the full result. Why? What happens under the hood when I try to download the full result?

Here is the exception:

SparkException: Job aborted due to stage failure: Task 0 in stage 133.0 failed 4 times, most recent failure: Lost task 0.3 in stage 133.0 (TID 2644) (192.***.x.x executor 6): com.databricks.sql.io.FileReadException: Error while reading file abfss:REDACTED_LOCAL_PART@someurl. It is possible the underlying files have been updated. You can explicitly invalidate the cache in Spark by running 'REFRESH TABLE tableName' command in SQL or by recreating the Dataset/DataFrame involved. If Delta cache is stale or the underlying files have been removed, you can invalidate Delta cache manually by restarting the cluster.

Caused by: FileReadException: Error while reading file abfss:REDACTED_LOCAL_PART@someurl. It is possible the underlying files have been updated. You can explicitly invalidate the cache in Spark by running 'REFRESH TABLE tableName' command in SQL or by recreating the Dataset/DataFrame involved. If Delta cache is stale or the underlying files have been removed, you can invalidate Delta cache manually by restarting the cluster.

Caused by: FileNotFoundException: Operation failed: "The specified path does not exist.", 404, HEAD, https://***.snappy.parquet?upn=false&action=getStatus&timeout=90

Caused by: AbfsRestOperationException: Operation failed: "The specified path does not exist.", 404, HEAD, https://***.snappy.parquet?upn=false&action=getStatus&timeout=90
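
For reference, the remediation the error message itself points at can be run from a notebook before retrying the download. A minimal sketch; "my_database.my_table" is a placeholder, not the actual table from the failing query:

# Minimal sketch of the cache-invalidation steps the error message suggests.
# Replace "my_database.my_table" with the table the failing query reads from.
spark.sql("REFRESH TABLE my_database.my_table")

# Alternatively, clear all cached data/metadata for the session before re-running.
spark.catalog.clearCache()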

1 ACCEPTED SOLUTION

Tahseen0354
Valued Contributor

It's working now, I think it was a network issue.


9 REPLIES

Anonymous
Not applicable

@Md Tahseen Anam​ - Hello! My name is Piper and I'm one of the community moderators. Thanks for your question. Let's give it a bit longer to see what the community has to say. Hang in there!

Hi, thank you for your reply. It would be great to get some light on this.

User16763506477
Contributor III

Hi @Md Tahseen Anam​ are there any updates happening to the table while you are downloading the results?

No updates. Could it be a network issue?

Hi @Md Tahseen Anam,

Have you tried the following steps to re-run your query and get the full results? See the docs here.
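
If the UI download keeps failing, one workaround is to re-run the query in a notebook and write the full result to cloud storage, then pull the file from there. A rough sketch; the query, table name, and output path below are placeholders:

# Hypothetical sketch: export the full result instead of downloading it from the UI.
result = spark.sql("SELECT * FROM my_database.my_table")   # the original query

(result
 .coalesce(1)                  # single output file; acceptable for ~1 million rows
 .write
 .mode("overwrite")
 .option("header", "true")
 .csv("abfss://container@account.dfs.core.windows.net/exports/full_result"))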

Tahseen0354
Valued Contributor

It's working now, I think it was a network issue.

Anonymous
Not applicable

@Md Tahseen Anam​ - Thanks for letting us know. I'm glad things are working!

rpshgupta
New Contributor III

I am also running into this issue again and again. I really want to understand what we can do to avoid it.

ac567
New Contributor II

Job aborted due to stage failure: Task 6506 in stage 46.0 failed 4 times, most recent failure: Lost task 6506.3 in stage 46.0 (TID 12896) (10.**.***.*** executor 12): java.lang.OutOfMemoryError: Cannot reserve 4194304 bytes of direct buffer memory (allocated: 5062249863, limit: 5065146368)

I am facing this issue when I run my code in a Databricks notebook on serverless compute. The code reads data from a table (700 million rows) and ingests the rows to an API in batches; after getting the response from the API, I store the failed batches in another table. After ingesting about 250 million records I get this error.
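
Not the poster's actual code, but one pattern that can keep per-task memory bounded on serverless compute (where RDD APIs are not available, so this stays on the DataFrame API) is to call the API from mapInPandas and return only the failed rows. A rough sketch; the table names, endpoint URL, and batch size are placeholders:

BATCH_SIZE = 1000  # rows per API request; keep each request small

def ingest(batches):
    # "batches" is an iterator of pandas DataFrames, one Arrow batch at a time,
    # so only a small slice of the source rows is held in memory per task.
    import requests
    for pdf in batches:
        for start in range(0, len(pdf), BATCH_SIZE):
            chunk = pdf.iloc[start:start + BATCH_SIZE]
            resp = requests.post(
                "https://example.com/ingest",              # placeholder endpoint
                data=chunk.to_json(orient="records"),
                headers={"Content-Type": "application/json"},
            )
            if resp.status_code != 200:
                yield chunk                                # keep only the failed rows

src = spark.table("source_table")                          # placeholder table
failed = src.mapInPandas(ingest, schema=src.schema)
failed.write.mode("append").saveAsTable("failed_batches")  # placeholder table

# Lowering spark.sql.execution.arrow.maxRecordsPerBatch shrinks the pandas
# batches handed to ingest() if they are still too large.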
