Insert query fails with error "The query is not executed because it tries to launch ***** tasks in a single stage, while the maximum allowed tasks one query can launch is 100000"

shan_chandra
Esteemed Contributor
Py4JJavaError: An error occurred while calling o236.sql. : org.apache.spark.SparkException: Job aborted. at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:201) at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:192) at 
Caused by: java.util.concurrent.ExecutionException: org.apache.spark.SparkException: The query is not executed because it tries to launch 390944 tasks in a single stage, while the maximum allowed tasks one query can launch is 100000; this limit can be modified with configuration parameter "spark.databricks.queryWatchdog.maxQueryTasks".

Please find the summary of the error stack trace above. Could you please let us know how to resolve this issue?

1 ACCEPTED SOLUTION


shan_chandra
Esteemed Contributor

Could you please increase the below config (at the cluster level) to a higher value, or set it to zero:

spark.databricks.queryWatchdog.maxQueryTasks 0

Changing this Spark config alleviates the issue by raising (or, with 0, removing) the Query Watchdog task limit that the error message refers to.
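
For reference, here is a minimal sketch of both ways to apply it. The cluster-level line goes into the cluster's Spark config (Compute > your cluster > Advanced options > Spark) and takes effect after a restart. The session-level call assumes you are in a Databricks Python notebook where a SparkSession named spark already exists; the value 200000 is only an illustrative higher limit, not a recommendation.

# Cluster level: add this line to the cluster's Spark config and restart the cluster
#   spark.databricks.queryWatchdog.maxQueryTasks 0

# Session level (Python notebook): set the same property before running the INSERT.
# 200000 is an illustrative higher limit; per the answer above, 0 removes the limit.
spark.conf.set("spark.databricks.queryWatchdog.maxQueryTasks", "200000")

# Confirm the value took effect
print(spark.conf.get("spark.databricks.queryWatchdog.maxQueryTasks"))

If 0 does disable the check entirely (as the answer implies), raising the limit to a finite value instead keeps some protection against runaway queries.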

