Hi everyone,
I am on DBR 13, using managed tables in a custom catalog whose storage location is on AWS S3, and running my notebook on a single-user cluster.
I am hitting a MalformedInputException when saving data to the tables or reading from them.
The first run of the notebook, when the tables are empty, works fine and the data is saved. But if I rerun the notebook immediately afterwards, the save fails with the exception in the subject. However, if I run the notebook again the next day, or if I first delete everything from the table, it works fine again.
I am writing with df.write.mode('overwrite').saveAsTable('tablename') and reading the Delta table with DeltaTable.forName().
Error (raised by DeltaTable.forName(); the same happens from saveAsTable()):

Py4JJavaError: An error occurred while calling z:io.delta.tables.DeltaTable.forName.
java.nio.charset.MalformedInputException: Input length = 1
	at java.nio.charset.CoderResult.throwException(CoderResult.java:281)
	at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:339)
	at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
	at java.io.InputStreamReader.read(InputStreamReader.java:184)
	at java.io.BufferedReader.read1(BufferedReader.java:210)
	at java.io.BufferedReader.read(BufferedReader.java:286)
	at java.io.Reader.read(Reader.java:140)
	at scala.io.BufferedSource.mkString(BufferedSource.scala:98)
	at com.databricks.common.client.RawDBHttpClient.getResponseBody(DBHttpClient.scala:1229)
	at com.databricks.common.client.RawDBHttpClient.$anonfun$httpRequestInternal$1(DBHttpClient.scala:1191)
	at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:571)
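Note that, per the trace, the decode failure happens inside the Databricks HTTP client (RawDBHttpClient.getResponseBody) while it reads a response body as text, not while reading my actual table data. For anyone unfamiliar with the exception: MalformedInputException is Java's strict-decoding error for bytes that are not valid in the charset the reader expects, roughly the analogue of Python's UnicodeDecodeError. A minimal pure-Python illustration of the same kind of failure (the byte value here is just an example, not taken from the actual response):

```python
# Java's MalformedInputException corresponds to Python's UnicodeDecodeError:
# the input bytes are not valid in the charset the decoder expects.
bad = b"abc\xff"  # 0xff can never appear in valid UTF-8

try:
    bad.decode("utf-8")  # strict decoding, like Java's default REPORT action
    decoded = None
except UnicodeDecodeError as exc:
    decoded = exc.reason  # short description of why decoding failed

print(decoded)
```

So the client seems to be decoding a response with the wrong (or a non-UTF-8) charset, which would explain why the error is intermittent rather than tied to the data itself.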
Thanks in advance for any suggestions.