Hello,
I have a Databricks table with a column that uses the new GEOMETRY type. When I try to access this table from an external Spark workload, I cannot describe the table or operate on any of its columns. My Spark config, per the Databricks docs, is the following:
spark.sql.catalog.spark_catalog org.apache.spark.sql.delta.catalog.DeltaCatalog
spark.sql.catalog.my_catalog io.unitycatalog.spark.UCSingleCatalog
spark.sql.catalog.my_catalog.uri <Databricks workspace URL>/api/2.1/unity-catalog
spark.sql.catalog.my_catalog.token <Databricks PAT>
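For context, the failure triggers on nothing more than a metadata lookup. A minimal PySpark sketch of what I'm running (table and schema names below are placeholders, and the URI/token placeholders mirror the config above):

```python
# Sketch only: reproduces the failure against a UC table that has a GEOMETRY column.
# <Databricks workspace URL>, <Databricks PAT>, and the table name are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .config("spark.sql.catalog.my_catalog",
            "io.unitycatalog.spark.UCSingleCatalog")
    .config("spark.sql.catalog.my_catalog.uri",
            "<Databricks workspace URL>/api/2.1/unity-catalog")
    .config("spark.sql.catalog.my_catalog.token", "<Databricks PAT>")
    .getOrCreate()
)

# Even a schema-only operation fails, because loadTable deserializes every
# column's type_name and the client-side enum has no GEOMETRY member:
spark.sql("DESCRIBE TABLE my_catalog.my_schema.my_geo_table").show()
# -> io.unitycatalog.client.ApiException ... Unexpected value 'GEOMETRY'
```

Selecting only non-geometry columns from the table fails the same way, since table resolution happens before column pruning.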
Any attempt to access the table from my Spark workload - even just examining its schema, or selecting only the non-geometry columns - fails with an error like this:
io.unitycatalog.client.ApiException: com.fasterxml.jackson.databind.exc.ValueInstantiationException: Cannot construct instance of `io.unitycatalog.client.model.ColumnTypeName`, problem: Unexpected value 'GEOMETRY'
at [Source: (jdk.internal.net.http.ResponseSubscribers$HttpResponseInputStream); line: 1, column: 8354] (through reference chain: io.unitycatalog.client.model.TableInfo["columns"]->java.util.ArrayList[13]->io.unitycatalog.client.model.ColumnInfo["type_name"])
at io.unitycatalog.client.api.TablesApi.getTableWithHttpInfo(TablesApi.java:273)
at io.unitycatalog.client.api.TablesApi.getTable(TablesApi.java:241)
at io.unitycatalog.spark.UCProxy.loadTable(UCSingleCatalog.scala:237)
at org.apache.spark.sql.connector.catalog.DelegatingCatalogExtension.loadTable(DelegatingCatalogExtension.java:73)
at org.apache.spark.sql.delta.catalog.DeltaCatalog.super$loadTable(DeltaCatalog.scala:229)
at org.apache.spark.sql.delta.catalog.DeltaCatalog.$anonfun$loadTable$1(DeltaCatalog.scala:229)
at org.apache.spark.sql.delta.metering.DeltaLogging.recordFrameProfile(DeltaLogging.scala:171)
at org.apache.spark.sql.delta.metering.DeltaLogging.recordFrameProfile$(DeltaLogging.scala:169)
at org.apache.spark.sql.delta.catalog.DeltaCatalog.recordFrameProfile(DeltaCatalog.scala:67)
at org.apache.spark.sql.delta.catalog.DeltaCatalog.loadTable(DeltaCatalog.scala:228)
at io.unitycatalog.spark.UCSingleCatalog.loadTable(UCSingleCatalog.scala:73)
at org.apache.spark.sql.connector.catalog.CatalogV2Util$.getTable(CatalogV2Util.scala:363)
at org.apache.spark.sql.connector.catalog.CatalogV2Util$.loadTable(CatalogV2Util.scala:337)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.$anonfun$lookupTableOrView$2(Analyzer.scala:1228)
[...]