<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Error loading model from mlflow: java.io.StreamCorruptedException: invalid type code: 00 in Machine Learning</title>
    <link>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28808#M1594</link>
    <description>Forum thread: loading a SparkML model saved with MLflow fails with java.io.StreamCorruptedException: invalid type code: 00 when using Databricks Connect from an IDE. The full question and replies appear as items below.</description>
    <pubDate>Thu, 06 Oct 2022 13:38:43 GMT</pubDate>
    <dc:creator>NSRBX</dc:creator>
    <dc:date>2022-10-06T13:38:43Z</dc:date>
    <item>
      <title>Error loading model from mlflow: java.io.StreamCorruptedException: invalid type code: 00</title>
      <link>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28808#M1594</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;In my IDE, I'm using Databricks Connect 9.1 LTS ML to connect to a Databricks cluster running Spark 3.1 and to download a Spark model that was trained and saved with MLflow.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;It seems able to find and copy the model, but loading then fails. The same code&amp;nbsp;&lt;B&gt;works fine in a Databricks notebook&lt;/B&gt;; the problem only occurs when using Databricks Connect from my IDE.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;We get the same error in different repositories with different models, and it started appearing recently.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I have the same problem in another environment, with a 10.4 LTS ML cluster and databricks-connect 10.4.6.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Do you have any idea?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Code:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;mlflow.set_tracking_uri(&lt;B&gt;"databricks"&lt;/B&gt;)&lt;/P&gt;&lt;P&gt;model_path = &lt;B&gt;'dbfs:/databricks/mlflow-tracking/197830957424395/7c5e692873874dadae4f67f44c1aa310/artifacts/rfModel'&lt;/B&gt;&lt;/P&gt;&lt;P&gt;model_res = mlflow.spark.load_model(model_path)&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;See the stack trace:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;2022/10/06 15:17:11 INFO mlflow.spark: File 'dbfs:/databricks/mlflow-tracking/197830957424395/7c5e692873874dadae4f67f44c1aa310/artifacts/rfModel/sparkml' not found on DFS. 
Will attempt to upload the file.&lt;/P&gt;&lt;P&gt;22/10/06 15:17:39 WARN DBFS: DBFS create on /tmp/mlflow/f020cb9a-47b2-49ee-8b12-cf2754db61a9/metadata/part-00000 took 2299 ms&lt;/P&gt;&lt;P&gt;22/10/06 15:17:42 WARN DBFS: DBFS create on /tmp/mlflow/f020cb9a-47b2-49ee-8b12-cf2754db61a9/metadata/_SUCCESS took 1687 ms&lt;/P&gt;&lt;P&gt;22/10/06 15:17:46 WARN DBFS: DBFS mkdirs on /tmp/mlflow/f020cb9a-47b2-49ee-8b12-cf2754db61a9/stages/0_RandomForestClassifier_77e9017cbf4d took 2302 ms&lt;/P&gt;&lt;P&gt;2022/10/06 15:19:13 INFO mlflow.spark: Copied SparkML model to /tmp/mlflow/f020cb9a-47b2-49ee-8b12-cf2754db61a9&lt;/P&gt;&lt;P&gt;View job details at ........https....&lt;/P&gt;&lt;P&gt;View job details at ........ https .....&lt;/P&gt;&lt;P&gt;22/10/06 15:19:16 ERROR Instrumentation: java.io.StreamCorruptedException: invalid type code: 00&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1698)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2405)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2329)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2187)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1667)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2405)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2329)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2187)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1667)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)&lt;/P&gt;&lt;P&gt;	at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:488)&lt;/P&gt;&lt;P&gt;	at 
sun.reflect.GeneratedMethodAccessor419.invoke(Unknown Source)&lt;/P&gt;&lt;P&gt;	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;/P&gt;&lt;P&gt;	at java.lang.reflect.Method.invoke(Method.java:498)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1184)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2296)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2187)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1667)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2405)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2329)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2187)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1667)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readArray(ObjectInputStream.java:2093)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1655)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2405)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2329)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2187)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1667)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readArray(ObjectInputStream.java:2093)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1655)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2405)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2329)&lt;/P&gt;&lt;P&gt;	at 
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2187)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1667)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.sql.util.ProtoSerializer.$anonfun$deserializeObject$1(ProtoSerializer.scala:6631)&lt;/P&gt;&lt;P&gt;	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)&lt;/P&gt;&lt;P&gt;	at org.apache.spark.sql.util.ProtoSerializer.deserializeObject(ProtoSerializer.scala:6616)&lt;/P&gt;&lt;P&gt;	at com.databricks.service.SparkServiceRPCHandler.execute0(SparkServiceRPCHandler.scala:728)&lt;/P&gt;&lt;P&gt;	at com.databricks.service.SparkServiceRPCHandler.$anonfun$executeRPC0$1(SparkServiceRPCHandler.scala:477)&lt;/P&gt;&lt;P&gt;	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)&lt;/P&gt;&lt;P&gt;	at com.databricks.service.SparkServiceRPCHandler.executeRPC0(SparkServiceRPCHandler.scala:372)&lt;/P&gt;&lt;P&gt;	at com.databricks.service.SparkServiceRPCHandler$$anon$2.call(SparkServiceRPCHandler.scala:323)&lt;/P&gt;&lt;P&gt;	at com.databricks.service.SparkServiceRPCHandler$$anon$2.call(SparkServiceRPCHandler.scala:309)&lt;/P&gt;&lt;P&gt;	at java.util.concurrent.FutureTask.run(FutureTask.java:266)&lt;/P&gt;&lt;P&gt;	at com.databricks.service.SparkServiceRPCHandler.$anonfun$executeRPC$1(SparkServiceRPCHandler.scala:359)&lt;/P&gt;&lt;P&gt;	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)&lt;/P&gt;&lt;P&gt;	at com.databricks.service.SparkServiceRPCHandler.executeRPC(SparkServiceRPCHandler.scala:336)&lt;/P&gt;&lt;P&gt;	at com.databricks.service.SparkServiceRPCServlet.doPost(SparkServiceRPCServer.scala:167)&lt;/P&gt;&lt;P&gt;	at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)&lt;/P&gt;&lt;P&gt;	at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:790)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:550)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:190)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.server.Server.handle(Server.java:516)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:388)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:633)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:380)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:383)&lt;/P&gt;&lt;P&gt;	at 
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:882)&lt;/P&gt;&lt;P&gt;	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1036)&lt;/P&gt;&lt;P&gt;	at java.lang.Thread.run(Thread.java:748)&lt;/P&gt;&lt;P&gt;...&lt;/P&gt;&lt;P&gt;py4j.protocol.Py4JJavaError: An error occurred while calling o588.load.&lt;/P&gt;&lt;P&gt;: java.io.StreamCorruptedException: invalid type code: 00&lt;/P&gt;&lt;P&gt;	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1698)&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks for your help.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 06 Oct 2022 13:38:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28808#M1594</guid>
      <dc:creator>NSRBX</dc:creator>
      <dc:date>2022-10-06T13:38:43Z</dc:date>
    </item>
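The model_path in the question follows MLflow's DBFS tracking layout: dbfs:/databricks/mlflow-tracking/ then the experiment id, the run id, "artifacts", and the artifact path. A small stdlib-only sketch (the helper name is hypothetical, just to make that layout explicit):

```python
def parse_mlflow_dbfs_path(path: str) -> dict:
    """Split an MLflow DBFS artifact URI of the form
    dbfs:/databricks/mlflow-tracking/EXPERIMENT/RUN/artifacts/NAME
    into its components, as seen in this thread's logs."""
    parts = path.removeprefix("dbfs:/databricks/mlflow-tracking/").split("/")
    # parts[2] is the literal "artifacts" segment
    return {"experiment_id": parts[0], "run_id": parts[1],
            "artifact_path": "/".join(parts[3:])}

# The exact path from the original post:
info = parse_mlflow_dbfs_path(
    "dbfs:/databricks/mlflow-tracking/197830957424395/"
    "7c5e692873874dadae4f67f44c1aa310/artifacts/rfModel")
print(info["run_id"])  # prints 7c5e692873874dadae4f67f44c1aa310
```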
    <item>
      <title>Re: Error loading model from mlflow: java.io.StreamCorruptedException: invalid type code: 00</title>
      <link>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28809#M1595</link>
      <description>&lt;P&gt;@SERET Nathalie&amp;nbsp;- The client needs to be updated to the latest version to fix this issue:&amp;nbsp;&lt;A href="https://pypi.org/project/databricks-connect/#history" alt="https://pypi.org/project/databricks-connect/#history" target="_blank"&gt;https://pypi.org/project/databricks-connect/#history&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 12 Oct 2022 15:46:01 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28809#M1595</guid>
      <dc:creator>shan_chandra</dc:creator>
      <dc:date>2022-10-12T15:46:01Z</dc:date>
    </item>
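The fix above hinges on the databricks-connect client tracking the cluster's Databricks Runtime (DBR) line. A minimal sanity-check sketch (the helper name and the major.minor convention are assumptions drawn from the version pairs in this thread, not part of databricks-connect):

```python
def versions_match(installed: str, cluster_dbr: str) -> bool:
    """True when the installed databricks-connect client and the
    cluster's DBR agree on their first two version components,
    e.g. client 10.4.6 against a 10.4 LTS cluster."""
    return installed.split(".")[:2] == cluster_dbr.split(".")[:2]

print(versions_match("10.4.6", "10.4"))   # prints True
print(versions_match("9.1.24", "10.4"))   # prints False
```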
    <item>
      <title>Re: Error loading model from mlflow: java.io.StreamCorruptedException: invalid type code: 00</title>
      <link>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28811#M1597</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I upgraded databricks-connect to version 10.4.12 (mlflow 1.26), but it still doesn't work.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I have winutils.exe in my venv under Lib\site-packages\pyspark\bin, and my HADOOP_HOME environment variable is set correctly.&lt;/P&gt;&lt;P&gt;Python version: 3.8.10.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks for your help.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;See the stack trace:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;2022/10/14 17:18:09 INFO mlflow.spark: File 'dbfs:/databricks/mlflow-tracking/67260056032267/6580b479a0ba43beaa3dd7971561fbb7/artifacts/model_rf/sparkml' not found on DFS. Will attempt to upload the file.&lt;/P&gt;&lt;P&gt;Traceback (most recent call last):&lt;/P&gt;&lt;P&gt;&amp;nbsp;File "C:\Users\NSR\py-packages\test\test_mlflow.py", line 21, in &amp;lt;module&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;model = exp.get_model(nom="model_rf")&lt;/P&gt;&lt;P&gt;&amp;nbsp;File "C:\Users\NSR\py-packages\ircem\mlflow.py", line 178, in get_model&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;return mlflow.spark.load_model(model_path)&lt;/P&gt;&lt;P&gt;&amp;nbsp;File "D:\venv_python\Python38\lib\site-packages\mlflow\spark.py", line 711, in load_model&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;return _load_model(model_uri=model_uri, dfs_tmpdir_base=dfs_tmpdir)&lt;/P&gt;&lt;P&gt;&amp;nbsp;File "D:\venv_python\Python38\lib\site-packages\mlflow\spark.py", line 659, in _load_model&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;model_uri = _HadoopFileSystem.maybe_copy_from_uri(model_uri, dfs_tmpdir)&lt;/P&gt;&lt;P&gt;&amp;nbsp;File "D:\venv_python\Python38\lib\site-packages\mlflow\spark.py", line 382, in maybe_copy_from_uri&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;return cls.maybe_copy_from_local_file(_download_artifact_from_uri(src_uri), dst_path)&lt;/P&gt;&lt;P&gt;&amp;nbsp;File 
"D:\venv_python\Python38\lib\site-packages\mlflow\spark.py", line 349, in maybe_copy_from_local_file&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;cls.copy_from_local_file(src, dst, remove_src=False)&lt;/P&gt;&lt;P&gt;&amp;nbsp;File "D:\venv_python\Python38\lib\site-packages\mlflow\spark.py", line 331, in copy_from_local_file&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;cls._fs().copyFromLocalFile(remove_src, cls._local_path(src), cls._remote_path(dst))&lt;/P&gt;&lt;P&gt;&amp;nbsp;File "D:\venv_python\Python38\lib\site-packages\py4j\java_gateway.py", line 1304, in __call__&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;return_value = get_return_value(&lt;/P&gt;&lt;P&gt;&amp;nbsp;File "D:\venv_python\Python38\lib\site-packages\pyspark\sql\utils.py", line 117, in deco&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;return f(*a, **kw)&lt;/P&gt;&lt;P&gt;&amp;nbsp;File "D:\venv_python\Python38\lib\site-packages\py4j\protocol.py", line 326, in get_return_value&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;raise Py4JJavaError(&lt;/P&gt;&lt;P&gt;py4j.protocol.Py4JJavaError: An error occurred while calling o334.copyFromLocalFile.&lt;/P&gt;&lt;P&gt;: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:793)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:1215)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.fs.FileUtil.list(FileUtil.java:1420)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:601)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at 
org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1972)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:2014)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:761)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:406)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:390)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:2482)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:2448)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at java.lang.reflect.Method.invoke(Unknown Source)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at py4j.Gateway.invoke(Gateway.java:295)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at py4j.commands.CallCommand.execute(CallCommand.java:79)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:195)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at 
py4j.ClientServerConnection.run(ClientServerConnection.java:115)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;at java.lang.Thread.run(Unknown Source)&lt;/P&gt;</description>
      <pubDate>Fri, 14 Oct 2022 15:37:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28811#M1597</guid>
      <dc:creator>Nath</dc:creator>
      <dc:date>2022-10-14T15:37:30Z</dc:date>
    </item>
    <item>
      <title>Re: Error loading model from mlflow: java.io.StreamCorruptedException: invalid type code: 00</title>
      <link>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28812#M1598</link>
      <description>&lt;P&gt;Hi @Kaniz Fatma&amp;nbsp;and @Shanmugavel Chandrakasu,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;It works after putting hadoop.dll into the C:\Windows\System32 folder.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I have Hadoop version 3.3.1.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I already had winutils.exe&amp;nbsp;in the Hadoop bin folder.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Regards&lt;/P&gt;&lt;P&gt;Nath&lt;/P&gt;</description>
      <pubDate>Mon, 17 Oct 2022 11:41:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28812#M1598</guid>
      <dc:creator>NSRBX</dc:creator>
      <dc:date>2022-10-17T11:41:34Z</dc:date>
    </item>
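The resolution above depends on three Windows-side prerequisites: HADOOP_HOME set, winutils.exe in the Hadoop bin folder, and hadoop.dll loadable by the JVM (for the poster, copying it into C:\Windows\System32 worked). A stdlib-only pre-flight sketch (the function name and the System32 fallback are assumptions drawn from this thread, not an official API):

```python
import os
from pathlib import Path

def missing_hadoop_prereqs(system32="C:/Windows/System32"):
    """Return a list of human-readable problems with the local Hadoop
    setup that this thread identifies as causes of load failures."""
    problems = []
    hadoop_home = os.environ.get("HADOOP_HOME")
    if not hadoop_home:
        problems.append("HADOOP_HOME is not set")
        return problems
    bin_dir = Path(hadoop_home) / "bin"
    if not (bin_dir / "winutils.exe").is_file():
        problems.append(f"winutils.exe not found in {bin_dir}")
    # hadoop.dll must be visible to the JVM: either next to winutils.exe
    # or in System32 (the fix that worked for the poster).
    if not ((bin_dir / "hadoop.dll").is_file()
            or (Path(system32) / "hadoop.dll").is_file()):
        problems.append("hadoop.dll not found in Hadoop bin or System32")
    return problems
```

These checks are Windows-specific shims and do not apply on Linux; there, the earlier advice about matching the databricks-connect client to the cluster version is the first thing to verify.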
    <item>
      <title>Re: Error loading model from mlflow: java.io.StreamCorruptedException: invalid type code: 00</title>
      <link>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28814#M1600</link>
      <description>&lt;P&gt;I am having the same issue with Databricks Connect 10.4.22, and I came across this old post. I am using Linux, though; what would be the equivalent fix here (hadoop.dll is a Windows library)?&lt;/P&gt;</description>
      <pubDate>Thu, 23 Mar 2023 19:43:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/error-loading-model-from-mlflow-java-io-streamcorruptedexception/m-p/28814#M1600</guid>
      <dc:creator>Benglish11</dc:creator>
      <dc:date>2023-03-23T19:43:40Z</dc:date>
    </item>
  </channel>
</rss>

