Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Upgraded cluster to 16.1/16.2 and data upload (append) to Elastic index is failing

nishg
New Contributor II

I updated our compute cluster to Databricks Runtime 16.1 and then 16.2 and ran the workflow that appends data into an Elastic index, but it started failing with the error below. The same job works fine on Databricks Runtime 15. Let me know if anyone has come across this issue after updating to 16.1/16.2.
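For reference, the write is a plain DataFrameWriter append through the elasticsearch-spark connector. A minimal sketch of the shape of the failing call (the index name "myindex" matches the error below, but the node address and schema are placeholders, not the actual job config):

# Minimal sketch of the failing append, assuming the elasticsearch-spark
# (elasticsearch-hadoop) connector is installed on the cluster.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])  # placeholder data

(df.write
    .format("org.elasticsearch.spark.sql")
    .option("es.nodes", "https://my-es-host:9200")  # placeholder address
    .mode("append")
    .save("myindex"))  # relative "path" that the 16.x runtime tries to resolve (see trace below)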

 

Path must be absolute: myindex/_delta_log
JVM stacktrace:
java.lang.IllegalArgumentException
    at com.databricks.common.path.AbstractPath$.fromHadoopPath(AbstractPath.scala:114)
    at com.databricks.backend.daemon.data.filesystem.MountEntryResolver.resolve(MountEntryResolver.scala:52)
    at com.databricks.backend.daemon.data.client.DBFSOnUCFileSystemResolverImpl.resolveWithSam(DBFSOnUCFileSystemResolverImpl.scala:145)
    at com.databricks.backend.daemon.data.client.DBFSOnUCFileSystemResolverImpl.resolveAndGetFileSystem(DBFSOnUCFileSystemResolverImpl.scala:195)
    at com.databricks.backend.daemon.data.client.DBFSV2.resolveAndGetFileSystem(DatabricksFileSystemV2.scala:143)
    at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.resolve(DatabricksFileSystemV2.scala:771)
    at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.$anonfun$getFileStatus$2(DatabricksFileSystemV2.scala:1184)
    at com.databricks.s3a.S3AExceptionUtils$.convertAWSExceptionToJavaIOException(DatabricksStreamUtils.scala:64)
    at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.$anonfun$getFileStatus$1(DatabricksFileSystemV2.scala:1183)
    at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:508)
    at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:613)
    at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:636)
    at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:49)
    at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:295)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:291)
    at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:47)
    at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:44)
    at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionContext(DatabricksFileSystemV2.scala:739)
    at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:96)
    at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:77)
    at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionTags(DatabricksFileSystemV2.scala:739)
    at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:608)
    at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:517)
    at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.recordOperationWithResultTags(DatabricksFileSystemV2.scala:739)
    at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:509)
    at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:475)
    at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.recordOperation(DatabricksFileSystemV2.scala:739)
    at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.getFileStatus(DatabricksFileSystemV2.scala:1182)
    at com.databricks.backend.daemon.data.client.DatabricksFileSystem.getFileStatus(DatabricksFileSystem.scala:211)
    at com.databricks.common.filesystem.LokiFileSystem.getFileStatus(LokiFileSystem.scala:313)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1862)
    at com.databricks.sql.transaction.tahoe.DeltaTableUtils$.findDeltaTableRootThrowOnError(DeltaTable.scala:339)
    at com.databricks.sql.transaction.tahoe.DeltaTableUtils$.findDeltaTableRoot(DeltaTable.scala:287)
    at com.databricks.sql.transaction.tahoe.DeltaTableUtils$.findDeltaTableRoot(DeltaTable.scala:278)
    at com.databricks.sql.transaction.tahoe.DeltaTableUtils$.findDeltaTableRoot(DeltaTable.scala:270)
    at com.databricks.sql.transaction.tahoe.DeltaValidation$.validateNonDeltaWrite(DeltaValidation.scala:200)
    at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:306)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:273)
    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.handleWriteOperation(SparkConnectPlanner.scala:3540)
    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.process(SparkConnectPlanner.scala:3047)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.handleCommand(ExecuteThreadRunner.scala:402)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1(ExecuteThreadRunner.scala:289)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1$adapted(ExecuteThreadRunner.scala:210)
    at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$2(SessionHolder.scala:399)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1384)
    at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$1(SessionHolder.scala:399)
    at org.apache.spark.JobArtifactSet$.withActiveJobArtifactState(JobArtifactSet.scala:97)
    at org.apache.spark.sql.artifact.ArtifactManager.$anonfun$withResources$1(ArtifactManager.scala:90)
    at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:240)
    at org.apache.spark.sql.artifact.ArtifactManager.withResources(ArtifactManager.scala:89)
    at org.apache.spark.sql.connect.service.SessionHolder.withSession(SessionHolder.scala:398)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.executeInternal(ExecuteThreadRunner.scala:210)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.org$apache$spark$sql$connect$execution$ExecuteThreadRunner$$execute(ExecuteThreadRunner.scala:129)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.$anonfun$run$2(ExecuteThreadRunner.scala:628)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:51)
    at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)
    at com.databricks.unity.HandleImpl.$anonfun$runWithAndClose$1(UCSHandle.scala:109)
    at scala.util.Using$.resource(Using.scala:269)
    at com.databricks.unity.HandleImpl.runWithAndClose(UCSHandle.scala:108)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.run(ExecuteThreadRunner.scala:628)
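Reading the trace: DataFrameWriter.save() routes the save argument through DeltaValidation.validateNonDeltaWrite, which probes "myindex/_delta_log" as if the index name were a filesystem location, and the relative path is rejected on 16.x. One thing that may be worth trying (a sketch, not a confirmed fix: es.resource is a documented elasticsearch-hadoop setting, and the assumption here is that the connector honors it when save() is called without a path) is to keep the index name out of the save path entirely:

# Sketch of a possible workaround: name the index via es.resource so that
# no relative "path" argument reaches the Delta table-root check.
# Assumes the connector falls back to es.resource when save() gets no path.
(df.write
    .format("org.elasticsearch.spark.sql")
    .option("es.nodes", "https://my-es-host:9200")  # placeholder address
    .option("es.resource", "myindex")
    .mode("append")
    .save())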

0 REPLIES
