05-10-2024 06:20 AM
Library installation failed for library due to user error for jar: "dbfs:////<<PATH>>/jackson-annotations-2.16.1.jar"
Error messages:
Library installation attempted on the driver node of cluster <<clusterId>> and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: java.lang.IllegalArgumentException: requirement failed: File jackson_annotations_2_16_1.jar was already registered with a different path (old path = /local_disk0/tmp/addedFile3ec2ccf585294f668470b238801344c96148554371564875980/jackson_annotations_2_16_1.jar, new path = /local_disk0/tmp/addedFile8887962e39154abab2dcd271b871652d6673451483775455291/jackson_annotations_2_16_1.jar)
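For background on what the requirement failure means: Spark's RPC file server registers staged files by base name only, so the same jar staged twice under different /local_disk0/tmp/addedFile... directories cannot be registered a second time. A rough sketch of that guard, paraphrasing OSS Spark's NettyStreamManager.addFile (simplified, not the Databricks code):

```scala
import java.io.File
import java.util.concurrent.ConcurrentHashMap

// Sketch of the guard in org.apache.spark.rpc.netty.NettyStreamManager.addFile:
// staged files are keyed by base name only, so a second registration of the
// same jar name from a different temp path fails the require.
object FileRegistry {
  private val files = new ConcurrentHashMap[String, File]()

  def addFile(file: File): String = {
    val existing = files.putIfAbsent(file.getName, file)
    require(existing == null || existing == file,
      s"File ${file.getName} was already registered with a different path " +
      s"(old path = $existing, new path = $file)")
    s"spark://<driver>/files/${file.getName}"
  }
}
```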
05-13-2024 08:25 AM
Hi @pavansharma36,
Actually, you cannot install the same version of a jar/package more than once. If for some reason your existing package needs to be reinstalled, remove it, restart your cluster, and install it again.
For more context:
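A minimal sketch of that remove/restart/reinstall flow, assuming a personal access token and the workspace Libraries REST API (/api/2.0/libraries/uninstall and /install); the host, token, cluster id, and jar path below are placeholders:

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object ReinstallJar {
  // Placeholders: set your workspace URL and a personal access token.
  val host   = sys.env("DATABRICKS_HOST")  // e.g. https://<workspace>.cloud.databricks.com
  val token  = sys.env("DATABRICKS_TOKEN")
  val client = HttpClient.newHttpClient()

  def post(endpoint: String, body: String): Unit = {
    val req = HttpRequest.newBuilder()
      .uri(URI.create(s"$host/api/2.0/libraries/$endpoint"))
      .header("Authorization", s"Bearer $token")
      .header("Content-Type", "application/json")
      .POST(HttpRequest.BodyPublishers.ofString(body))
      .build()
    val resp = client.send(req, HttpResponse.BodyHandlers.ofString())
    println(s"$endpoint -> ${resp.statusCode()} ${resp.body()}")
  }

  def main(args: Array[String]): Unit = {
    val payload =
      """{"cluster_id": "<cluster-id>",
        | "libraries": [{"jar": "dbfs:/<path>/jackson-annotations-2.16.1.jar"}]}""".stripMargin
    post("uninstall", payload)
    // The uninstall only takes effect after the cluster restarts
    // (e.g. POST /api/2.0/clusters/restart), then reinstall:
    post("install", payload)
  }
}
```

Note that the restart step between the two calls matters: a library marked for uninstall stays on the cluster until it restarts.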
05-14-2024 12:20 AM
The issue occurs on both interactive and ephemeral clusters. It used to work on previous runtime versions but started failing with runtime version 14.3.
We install some default libraries at cluster creation and also pass all libraries required by the job during submission, so Jackson is installed both at cluster creation and as a job library.
Since the libraries come from DBFS and point to the same jar, that shouldn't be an issue, right?
Was any configuration added that checks for duplicate jars by full path instead of just the file name?
Thanks
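One possible workaround, given that the check keys on the jar's file name rather than its full path: filter the job's library list against what was already installed at cluster creation, so the same jar is never submitted twice. A minimal sketch with a hypothetical helper (not a Databricks API):

```scala
// Hypothetical pre-submission filter: since Spark registers added files by
// base name only, drop job libraries whose jar file name collides with one
// already installed on the cluster.
object LibraryDedup {
  def jarName(uri: String): String =
    uri.substring(uri.lastIndexOf('/') + 1)

  def dedupeJobLibraries(clusterLibs: Seq[String], jobLibs: Seq[String]): Seq[String] = {
    val installed = clusterLibs.map(jarName).toSet
    jobLibs.filterNot(lib => installed.contains(jarName(lib)))
  }
}

// Example:
// LibraryDedup.dedupeJobLibraries(
//   Seq("dbfs:/libs/jackson-annotations-2.16.1.jar"),
//   Seq("dbfs:/libs/jackson-annotations-2.16.1.jar", "dbfs:/libs/other.jar"))
// -> Seq("dbfs:/libs/other.jar")
```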
05-14-2024 08:34 AM
@pavansharma36 Thanks for the details.
I had a look at the runtime and platform release notes and couldn't find anything that would explain a change in behavior. I can only suppose that something changed behind the scenes, but that is guesswork, not fact.
It's only an opinion, but:
It would be great if we could have release notes covering package-management changes, because with the progressive removal of DBFS, changes will continue to occur.
05-30-2024 11:15 PM
Hi @Edouard_JH
Adding more details on this issue.
We faced this issue with several other jars on Databricks runtime 14.3; the error stack trace is below. The error seems to come from the changes made under https://issues.apache.org/jira/browse/SPARK-35691.
The issue seems to be specific to 14.3 and is not reproducible on older Databricks runtime versions.
java.lang.IllegalArgumentException: requirement failed: File mssql_jdbc_12_6_0_jre8.jar was already registered with a different path (old path = /local_disk0/tmp/addedFileac4ec1d06f654f9b97e61540165d09b85339972689888734088/mssql_jdbc_12_6_0_jre8.jar, new path = /local_disk0/tmp/addedFile3ee85bdc97024c9b9e7132ff46f5979d5442393956418628834/mssql_jdbc_12_6_0_jre8.jar)
at scala.Predef$.require(Predef.scala:281)
at org.apache.spark.rpc.netty.NettyStreamManager.addFile(NettyStreamManager.scala:76)
at org.apache.spark.SparkContext.addFile(SparkContext.scala:2225)
at org.apache.spark.SparkContext.addFile(SparkContext.scala:2121)
at com.databricks.backend.daemon.driver.SharedDriverContext.$anonfun$addNewLibrary$1(SharedDriverContext.scala:369)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.SafeAddJarOrFile$.safe(SafeAddJarOrFile.scala:31)
at com.databricks.backend.daemon.driver.SharedDriverContext.addNewLibrary(SharedDriverContext.scala:368)
at com.databricks.backend.daemon.driver.SharedDriverContext.attachLibraryToSpark(SharedDriverContext.scala:507)
at com.databricks.backend.daemon.driver.SharedDriverContext.$anonfun$attachLibrariesToSpark$2(SharedDriverContext.scala:460)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:573)
at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:669)
at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:687)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.backend.daemon.driver.SharedDriverContext.withAttributionContext(SharedDriverContext.scala:156)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
at com.databricks.backend.daemon.driver.SharedDriverContext.withAttributionTags(SharedDriverContext.scala:156)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:664)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:582)
at com.databricks.backend.daemon.driver.SharedDriverContext.recordOperationWithResultTags(SharedDriverContext.scala:156)
at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:573)
at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:542)
at com.databricks.backend.daemon.driver.SharedDriverContext.recordOperation(SharedDriverContext.scala:156)
at com.databricks.backend.daemon.driver.SharedDriverContext.$anonfun$attachLibrariesToSpark$1(SharedDriverContext.scala:452)
at com.databricks.backend.daemon.driver.SharedDriverContext.$anonfun$attachLibrariesToSpark$1$adapted(SharedDriverContext.scala:437)
at scala.collection.immutable.List.foreach(List.scala:431)
at com.databricks.backend.daemon.driver.SharedDriverContext.attachLibrariesToSpark(SharedDriverContext.scala:437)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$handleRequest$9(DriverCorral.scala:815)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:128)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$handleRequest$8(DriverCorral.scala:814)
at com.databricks.backend.daemon.driver.DriverCorral.$anonfun$handleRequest$8$adapted(DriverCorral.scala:814)
at scala.util.Using$.resource(Using.scala:269)
at com.databricks.backend.daemon.driver.DriverCorral.com$databricks$backend$daemon$driver$DriverCorral$$handleRequest(DriverCorral.scala:814)
at com.databricks.backend.daemon.driver.DriverCorral$$anonfun$receive$1.applyOrElse(DriverCorral.scala:1228)
at com.databricks.backend.daemon.driver.DriverCorral$$anonfun$receive$1.applyOrElse(DriverCorral.scala:1224)
at com.databricks.rpc.ServerBackend.$anonfun$internalReceive0$2(ServerBackend.scala:174)
at com.databricks.rpc.ServerBackend$$anonfun$commonReceive$1.applyOrElse(ServerBackend.scala:200)
at com.databricks.rpc.ServerBackend$$anonfun$commonReceive$1.applyOrElse(ServerBackend.scala:200)
at com.databricks.rpc.ServerBackend.internalReceive0(ServerBackend.scala:171)
at com.databricks.rpc.ServerBackend.$anonfun$internalReceive$1(ServerBackend.scala:147)
at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:573)
at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:669)
at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:687)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.rpc.ServerBackend.withAttributionContext(ServerBackend.scala:22)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:472)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
at com.databricks.rpc.ServerBackend.withAttributionTags(ServerBackend.scala:22)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:664)
at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:582)
at com.databricks.rpc.ServerBackend.recordOperationWithResultTags(ServerBackend.scala:22)
at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:573)
at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:542)
at com.databricks.rpc.ServerBackend.recordOperation(ServerBackend.scala:22)
at com.databricks.rpc.ServerBackend.internalReceive(ServerBackend.scala:147)
at com.databricks.rpc.JettyServer$RequestManager.handleRPC(JettyServer.scala:1020)
at com.databricks.rpc.JettyServer$RequestManager.handleRequestAndRespond(JettyServer.scala:941)
at com.databricks.rpc.JettyServer$RequestManager.$anonfun$handleHttp$6(JettyServer.scala:545)
at com.databricks.rpc.JettyServer$RequestManager.$anonfun$handleHttp$6$adapted(JettyServer.scala:514)
at com.databricks.logging.activity.ActivityContextFactory$.$anonfun$withActivityInternal$4(ActivityContextFactory.scala:405)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.logging.activity.ActivityContextFactory$.withAttributionContext(ActivityContextFactory.scala:58)
at com.databricks.logging.activity.ActivityContextFactory$.$anonfun$withActivityInternal$1(ActivityContextFactory.scala:405)
at com.databricks.context.integrity.IntegrityCheckContext$ThreadLocalStorage$.withValue(IntegrityCheckContext.scala:44)
at com.databricks.logging.activity.ActivityContextFactory$.withActivityInternal(ActivityContextFactory.scala:380)
at com.databricks.logging.activity.ActivityContextFactory$.withServiceRequestActivity(ActivityContextFactory.scala:159)
at com.databricks.rpc.JettyServer$RequestManager.handleHttp(JettyServer.scala:514)
at com.databricks.rpc.JettyServer$RequestManager.doPost(JettyServer.scala:404)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:523)
at com.databricks.rpc.HttpServletWithPatch.service(HttpServletWithPatch.scala:33)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:590)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:554)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:190)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(Server.java:516)
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
at com.databricks.rpc.InstrumentedQueuedThreadPool$$anon$1.$anonfun$run$4(InstrumentedQueuedThreadPool.scala:104)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:216)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
at com.databricks.rpc.InstrumentedQueuedThreadPool.withAttributionContext(InstrumentedQueuedThreadPool.scala:47)
at com.databricks.rpc.InstrumentedQueuedThreadPool$$anon$1.$anonfun$run$1(InstrumentedQueuedThreadPool.scala:104)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.instrumentation.QueuedThreadPoolInstrumenter.trackActiveThreads(QueuedThreadPoolInstrumenter.scala:66)
at com.databricks.instrumentation.QueuedThreadPoolInstrumenter.trackActiveThreads$(QueuedThreadPoolInstrumenter.scala:63)
at com.databricks.rpc.InstrumentedQueuedThreadPool.trackActiveThreads(InstrumentedQueuedThreadPool.scala:47)
at com.databricks.rpc.InstrumentedQueuedThreadPool$$anon$1.run(InstrumentedQueuedThreadPool.scala:86)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
at java.lang.Thread.run(Thread.java:750)
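The same requirement failure can be reproduced against OSS Spark outside Databricks; a minimal sketch, assuming Spark 3.x on the classpath (paths and file contents are arbitrary):

```scala
import java.nio.file.{Files, Paths}
import org.apache.spark.sql.SparkSession

object AddFileCollision {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("repro").getOrCreate()
    val sc = spark.sparkContext

    // Stage two files that share a base name but live at different paths,
    // mirroring the same jar being staged twice under /local_disk0/tmp/addedFile...
    val a = Files.createDirectories(Paths.get("/tmp/repro-a")).resolve("dup.jar")
    val b = Files.createDirectories(Paths.get("/tmp/repro-b")).resolve("dup.jar")
    Files.write(a, Array[Byte](1))
    Files.write(b, Array[Byte](2))

    sc.addFile(a.toString)
    // Second registration with the same base name but a different path is
    // expected to fail with:
    // java.lang.IllegalArgumentException: requirement failed: File dup.jar
    //   was already registered with a different path ...
    sc.addFile(b.toString)
  }
}
```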