I have a problem upgrading a table to Unity Catalog; I get the following error:
summary:
Error in SQL statement:
AnalysisException:
org.apache.hadoop.hive.ql.metadata.HiveException:
Unable to fetch table calexception. Exception thrown when executing query :
SELECT DISTINCT 'org.apache.hadoop.hive.metastore.model.MTable' AS NUCLEUS_TYPE,A0.CREATE_TIME,A0.LAST_ACCESS_TIME,A0.OWNER,A0.RETENTION,A0.IS_REWRITE_ENABLED,A0.TBL_NAME,A0.TBL_TYPE,A0.TBL_ID FROM TBLS A0 LEFT OUTER JOIN DBS B0 ON A0.DB_ID = B0.DB_ID WHERE A0.TBL_NAME = ? AND B0.`NAME` = ?; UpgradeTableCommand `ps_dev`.`dd_omp`.`calexception`, `hive_metastore`.`dd_omp`.`calexception`, false, true, false, false, false , data: com.databricks.backend.common.rpc.SparkDriverExceptions$SQLExecutionException:
Has anybody ever had a similar problem upgrading external tables to Unity Catalog? I couldn't find a single clue about this issue. The full error message is attached.
This is the Spark config on my cluster:
spark.sql.hive.metastore.version 1.2.1
hive.metastore.schema.verification.record.version false
spark.databricks.service.port 8787
spark.databricks.driver.enableUserContextForPythonAndRCommands true
spark.sql.hive.metastore.jars /dbfs/databricks/hive_metastore_jars/*
hive.metastore.schema.verification false
spark.databricks.delta.preview.enabled true
spark.databricks.service.server.enabled true
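For context, here is a minimal sketch of what I understand the failing upgrade to be doing, run manually via SQL. This assumes the standard Unity Catalog SYNC syntax; the catalog, schema, and table names are taken from the UpgradeTableCommand in the error above, and DRY RUN is my assumption for testing the upgrade without applying it:

```sql
-- Sketch: manually upgrade the external table from the Hive metastore
-- to Unity Catalog (same source/target as the failing UpgradeTableCommand).
-- DRY RUN only reports what would happen, without changing anything.
SYNC TABLE ps_dev.dd_omp.calexception
  FROM hive_metastore.dd_omp.calexception
  DRY RUN;
```

I would expect this to surface the same HiveException if the underlying metastore query is the problem.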