We are trying to read a column of enum array datatype from Postgres as a string datatype in the target. We were able to achieve this by explicitly using the concat function while extracting, as shown below:
// Work around the array type by having Postgres return the column as text via concat
val jdbcDF3 = spark.read
  .format("jdbc")
  .option("url", <jdbc url>)
  .option("query", "SELECT concat(colname) AS colname FROM <tablename>")
  .load()
Does Spark not support reading the array datatype as a string by default, without using the concat function?
We get the "SQLException: Unsupported type ARRAY" error if we don't use the concat function while extracting.