Data Engineering

OPTIMIZE: Exception thrown in awaitResult: / by zero

dvmentalmadess
Valued Contributor

We run `OPTIMIZE` on our tables every 24 hours as follows:

spark.sql(f'OPTIMIZE {catalog_name}.{schema_name}.`{table_name}`;') 
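
(For context, the surrounding loop looks roughly like this; the table list below is a placeholder, since ours actually comes from configuration:)

tables = [
    ('main', 'sales', 'orders'),
    ('main', 'sales', 'line_items'),
]

for catalog_name, schema_name, table_name in tables:
    # Backticks guard table names that contain special characters.
    spark.sql(f'OPTIMIZE {catalog_name}.{schema_name}.`{table_name}`')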

This morning one of our hourly jobs started failing on the call to `OPTIMIZE` with the error:

org.apache.spark.SparkException: Exception thrown in awaitResult: / by zero

An initial search doesn't turn up anything for `OPTIMIZE`, and I'm not sure what can be done since I don't know what the inputs would be. I'm wondering if there's some maintenance operation I can run to clean out state somewhere?

And the stack trace:

Py4JJavaError: An error occurred while calling o363.sql.
: org.apache.spark.SparkException: Exception thrown in awaitResult: / by zero
	at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:54)
	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:462)
	at com.databricks.sql.io.skipping.SimpleOptimizer.execute(Optimizer.scala:204)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner$.binPackForIncrementalClustering(OptimizeRunner.scala:1446)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.$anonfun$optimize$4(OptimizeRunner.scala:739)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.$anonfun$optimize$4$adapted(OptimizeRunner.scala:673)
	at com.databricks.sql.transaction.tahoe.FileMetadataMaterializationTracker$.withTracker(FileMetadataMaterializationTracker.scala:226)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.$anonfun$optimize$1(OptimizeRunner.scala:673)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.withOperationTypeTag(DeltaLogging.scala:196)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.withOperationTypeTag$(DeltaLogging.scala:183)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.withOperationTypeTag(OptimizeRunner.scala:395)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.$anonfun$recordDeltaOperationInternal$2(DeltaLogging.scala:160)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile(DeltaLogging.scala:265)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile$(DeltaLogging.scala:263)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.recordFrameProfile(OptimizeRunner.scala:395)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.$anonfun$recordDeltaOperationInternal$1(DeltaLogging.scala:159)
	at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)
	at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:666)
	at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:684)
	at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
	at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
	at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
	at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:25)
	at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
	at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
	at com.databricks.spark.util.PublicDBLogging.withAttributionTags(DatabricksSparkUsageLogger.scala:25)
	at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:661)
	at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:580)
	at com.databricks.spark.util.PublicDBLogging.recordOperationWithResultTags(DatabricksSparkUsageLogger.scala:25)
	at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:571)
	at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:540)
	at com.databricks.spark.util.PublicDBLogging.recordOperation(DatabricksSparkUsageLogger.scala:25)
	at com.databricks.spark.util.PublicDBLogging.recordOperation0(DatabricksSparkUsageLogger.scala:66)
	at com.databricks.spark.util.DatabricksSparkUsageLogger.recordOperation(DatabricksSparkUsageLogger.scala:148)
	at com.databricks.spark.util.UsageLogger.recordOperation(UsageLogger.scala:72)
	at com.databricks.spark.util.UsageLogger.recordOperation$(UsageLogger.scala:59)
	at com.databricks.spark.util.DatabricksSparkUsageLogger.recordOperation(DatabricksSparkUsageLogger.scala:107)
	at com.databricks.spark.util.UsageLogging.recordOperation(UsageLogger.scala:433)
	at com.databricks.spark.util.UsageLogging.recordOperation$(UsageLogger.scala:412)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.recordOperation(OptimizeRunner.scala:395)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordDeltaOperationInternal(DeltaLogging.scala:158)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordDeltaOperation(DeltaLogging.scala:148)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordDeltaOperation$(DeltaLogging.scala:138)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.recordDeltaOperation(OptimizeRunner.scala:395)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.optimize(OptimizeRunner.scala:636)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.$anonfun$runAndReturnMetricsInternal$3(OptimizeRunner.scala:528)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.$anonfun$runAndReturnMetricsInternal$1(OptimizeRunner.scala:518)
	at com.databricks.sql.acl.CheckPermissions$.trusted(CheckPermissions.scala:2047)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.runAndReturnMetricsInternal(OptimizeRunner.scala:479)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.runAndReturnMetricsWithConflictChecking(OptimizeRunner.scala:460)
	at com.databricks.sql.transaction.tahoe.commands.optimize.OptimizeRunner.runAndReturnMetrics(OptimizeRunner.scala:447)
	at com.databricks.sql.transaction.tahoe.commands.OptimizeTableCommandEdge.$anonfun$run$1(OptimizeTableCommandEdge.scala:149)
	at com.databricks.sql.acl.CheckPermissions$.trusted(CheckPermissions.scala:2047)
	at com.databricks.sql.transaction.tahoe.commands.OptimizeTableCommandEdge.run(OptimizeTableCommandEdge.scala:107)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:82)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:80)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:79)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:91)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:272)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:166)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:272)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$8(SQLExecution.scala:282)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:510)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withCustomExecutionEnv$1(SQLExecution.scala:209)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1113)
	at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:152)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:459)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:271)
	at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:245)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:266)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:251)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:465)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:69)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:465)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:39)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:316)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:312)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:39)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:39)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:441)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:251)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:372)
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:251)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:203)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:200)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:262)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:123)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1113)
	at org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1120)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1120)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:113)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:837)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1113)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:826)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:859)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:397)
	at py4j.Gateway.invoke(Gateway.java:306)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:195)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:115)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.ArithmeticException: / by zero
	at com.databricks.sql.io.skipping.KdTreeClustering$.createKdTreeForDf(KdTreeClustering.scala:322)
	at com.databricks.sql.io.skipping.SimpleOptimizer.pushDownFiles(Optimizer.scala:730)
	at com.databricks.sql.io.skipping.SimpleOptimizer.$anonfun$executeInternal$1(Optimizer.scala:318)
	at com.databricks.spark.util.IdentityClaim$.withClaim(IdentityClaim.scala:48)
	at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper.$anonfun$runWithCaptured$4(SparkThreadLocalForwardingThreadPoolExecutor.scala:81)
	at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:41)
	at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper.runWithCaptured(SparkThreadLocalForwardingThreadPoolExecutor.scala:80)
	at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper.runWithCaptured$(SparkThreadLocalForwardingThreadPoolExecutor.scala:66)
	at org.apache.spark.util.threads.CapturedSparkThreadLocals.runWithCaptured(SparkThreadLocalForwardingThreadPoolExecutor.scala:94)
	at com.databricks.sql.io.skipping.liquid.LiquidThreadPool.$anonfun$submit$1(LiquidThreadPool.scala:41)
	at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
	at scala.util.Success.$anonfun$map$1(Try.scala:255)
	at scala.util.Success.map(Try.scala:213)
	at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
	at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
	at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
	at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable.$anonfun$run$1(SparkThreadLocalForwardingThreadPoolExecutor.scala:118)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.spark.util.IdentityClaim$.withClaim(IdentityClaim.scala:48)
	at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper.$anonfun$runWithCaptured$4(SparkThreadLocalForwardingThreadPoolExecutor.scala:81)
	at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:41)
	at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper.runWithCaptured(SparkThreadLocalForwardingThreadPoolExecutor.scala:80)
	at org.apache.spark.util.threads.SparkThreadLocalCapturingHelper.runWithCaptured$(SparkThreadLocalForwardingThreadPoolExecutor.scala:66)
	at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable.runWithCaptured(SparkThreadLocalForwardingThreadPoolExecutor.scala:115)
	at org.apache.spark.util.threads.SparkThreadLocalCapturingRunnable.run(SparkThreadLocalForwardingThreadPoolExecutor.scala:118)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more

 

1 ACCEPTED SOLUTION

dvmentalmadess
Valued Contributor

The job triggering this runs hourly, and the OPTIMIZE operation finally succeeded on the third attempt. I've submitted a ticket to support, so hopefully this gets logged as a bug, but at the very least it appears to be a transient issue for now.
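
In case it helps anyone hitting this, a retry wrapper around the call is enough to ride out the transient failure. This is just a sketch under my own assumptions: three attempts with a simple linear backoff, and `optimize_with_retry` / `table_fqn` are names I made up for illustration.

import time

def optimize_with_retry(spark, table_fqn, attempts=3, backoff_s=60):
    # Retry OPTIMIZE a few times, since the "/ by zero" failure was transient for us.
    for attempt in range(1, attempts + 1):
        try:
            spark.sql(f'OPTIMIZE {table_fqn}')
            return
        except Exception:
            if attempt == attempts:
                raise  # surface the error once all attempts are exhausted
            time.sleep(backoff_s * attempt)  # linear backoff between attempts

optimize_with_retry(spark, f'{catalog_name}.{schema_name}.`{table_name}`')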


3 REPLIES

tnyz
New Contributor II

I am getting the same issue on DBR 14.3.

sh
New Contributor II

I am getting the same error. Is there any resolution?
