Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

[Databricks][JDBC](10400) Invalid type for data - column: 10, type: Array

PraveenC
New Contributor II

Getting the below error while mapping an ARRAY column to a String[] entity field. Please advise whether the Databricks JDBC driver supports entity mapping of array values.

[The same code works with the following config, used for JUnit purposes: H2 DB version 2.1.214 with org.hibernate.dialect.H2Dialect.]

Spring + Hibernate

Library versions:

databricks-jdbc 2.6.32

hibernate-core 5.6.11.Final

hypersistence-utils-hibernate-55

dialect: org.hibernate.dialect.DerbyDialect

import io.hypersistence.utils.hibernate.type.array.StringArrayType;
import org.hibernate.annotations.Type;
import org.hibernate.annotations.TypeDef;
import org.hibernate.annotations.TypeDefs;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Table;

@TypeDefs({@TypeDef(name = "string-array", typeClass = StringArrayType.class)})
@Entity
@Table(name = "table-name")
public class TableName extends BaseEntity {

    @Type(type = "string-array")
    @Column(name = "COL_NAME", columnDefinition = "VARCHAR(100) ARRAY")
    private String[] colName;

    .....
}

ERROR o.h.e.jdbc.spi.SqlExceptionHelper - [Databricks][JDBC](10400) Invalid type for data - column: 10, type: Array.

javax.persistence.PersistenceException: org.hibernate.exception.DataException: Could not read entity state from ResultSet : EntityKey[<<class>>]
    at org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:154)
    at org.hibernate.internal.SessionImpl.find(SessionImpl.java:3435)
    at org.hibernate.internal.SessionImpl.find(SessionImpl.java:3362)
    at org.hibernate.internal.SessionImpl$IdentifierLoadAccessImpl.perform(SessionImpl.java:2768)
    at org.hibernate.internal.SessionImpl$IdentifierLoadAccessImpl.load(SessionImpl.java:2812)
    at org.hibernate.internal.SessionImpl.find(SessionImpl.java:3400)
    ... 110 common frames omitted
Caused by: java.sql.SQLDataException: [Databricks][JDBC](10400) Invalid type for data - column: 10, type: Array.
    at com.databricks.client.exceptions.ExceptionConverter.toSQLException(Unknown Source)
    at com.databricks.client.jdbc.common.SForwardResultSet.getArray(Unknown Source)
    at com.databricks.client.jdbc.common.BaseForwardResultSet.getArray(Unknown Source)
    at io.hypersistence.utils.hibernate.type.array.internal.ArraySqlTypeDescriptor$2.doExtract(ArraySqlTypeDescriptor.java:55)
    at org.hibernate.type.descriptor.sql.BasicExtractor.extract(BasicExtractor.java:47)
    at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:257)
    at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:253)
    at org.hibernate.type.AbstractStandardBasicType.nullSafeGet(AbstractStandardBasicType.java:243)
    at org.hibernate.type.AbstractStandardBasicType.hydrate(AbstractStandardBasicType.java:329)
    at org.hibernate.persister.entity.AbstractEntityPersister.hydrate(AbstractEntityPersister.java:3214)
    at org.hibernate.persister.entity.Loadable.hydrate(Loadable.java:94)
    at org.hibernate.loader.plan.exec.process.internal.EntityReferenceInitializerImpl.loadFromResultSet(EntityReferenceInitializerImpl.java:342)
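A minimal plain-JDBC check (a sketch, not from the original post; the connection URL, table name, and column name are placeholders) can confirm whether the driver's getArray() itself fails, independent of Hibernate:

// Minimal sketch: check at the plain JDBC level whether the Databricks driver
// can return the ARRAY column via getArray(), with Hibernate out of the picture.
// The connection URL, table name, and column name below are placeholders.
import java.sql.Array;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Arrays;

public class ArrayColumnCheck {
    public static void main(String[] args) throws Exception {
        String url = "<your Databricks JDBC URL>";
        try (Connection con = DriverManager.getConnection(url);
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT COL_NAME FROM table_name LIMIT 1")) {
            if (rs.next()) {
                // If this call throws [Databricks][JDBC](10400), the limitation is in
                // the driver's getArray(), not in the Hibernate/hypersistence mapping.
                Array arr = rs.getArray(1);
                System.out.println(Arrays.toString((Object[]) arr.getArray()));
            }
        }
    }
}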

4 REPLIES

Debayan
Databricks Employee

Anonymous
Not applicable

Hi, @Praveen C

How did you solve your error? I got the same problem.

emmanueltrindad
New Contributor II

Hi, I have the same problem in my project. How did you manage to solve it?

I'm using Java + Hibernate + H2 Dialect.

Atanu
Databricks Employee

Hello @Emmanuel Trindade @Praveen C This does not look like it is coming from the Databricks end. Look at the error stack trace:

javax.persistence.PersistenceException: org.hibernate.exception.DataException: Could not read entity state from ResultSet : EntityKey[<<class>>]
    at org.hibernate.internal.ExceptionConverterImpl.convert(ExceptionConverterImpl.java:154)
    at org.hibernate.internal.SessionImpl.find(SessionImpl.java:3435)
    at org.hibernate.internal.SessionImpl.find(SessionImpl.java:3362)

It looks like it is coming from Hibernate, possibly a data issue in the schema.
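If the driver's getArray() turns out to be the blocker, one possible workaround (a sketch only; the query, table, entity, and delimiter are illustrative, and array_join is a Spark SQL function) is to flatten the ARRAY to a delimited string in SQL and rebuild the String[] on the Java side:

// Sketch of a possible workaround, assuming the ARRAY column cannot be read via
// getArray(): flatten it to a delimited string in the query using Spark SQL's
// array_join, then split the string in Java. Names and delimiter are illustrative.
import java.util.List;
import javax.persistence.EntityManager;

public class TableNameDao {

    private final EntityManager em;

    public TableNameDao(EntityManager em) {
        this.em = em;
    }

    @SuppressWarnings("unchecked")
    public String[] loadColNames(long id) {
        List<String> rows = em.createNativeQuery(
                "SELECT array_join(COL_NAME, '||') FROM table_name WHERE id = :id")
            .setParameter("id", id)
            .getResultList();
        if (rows.isEmpty() || rows.get(0) == null) {
            return new String[0];
        }
        // Split on the same delimiter passed to array_join above.
        return rows.get(0).split("\\|\\|");
    }
}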
