I have upgraded Azure Databricks from runtime 6.4 to 9.1, which enables me to use Spark 3. As far as I know, the Hive metastore version has to be upgraded to 2.3.7 as well, as discussed in:
https://community.databricks.com/s/question/0D53f00001HKHy2CAH/how-to-upgrade-internal-hive-metadata...
I have tried not setting those options and continuing with Hive 0.13 to run my Spark 3 application, and everything seems fine. Is it a must to upgrade the Hive metastore to 2.3.7 for a Spark 3 program to run? Everything seems fine without adding:
spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars builtin
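For context, this is roughly how I check what the running session is actually using before and after the change (a minimal Scala sketch; the MetastoreCheck object name and the "(not set)" fallback strings are just for illustration, and on Databricks the two configs above would normally be set in the cluster's Spark config rather than in code):

import org.apache.spark.sql.SparkSession

object MetastoreCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("metastore-version-check")
      .enableHiveSupport()
      .getOrCreate()

    // Report which Hive metastore client version and jar source the session
    // was started with; fall back to "(not set)" if the config was never set.
    val metastoreVersion =
      spark.conf.get("spark.sql.hive.metastore.version", "(not set)")
    val metastoreJars =
      spark.conf.get("spark.sql.hive.metastore.jars", "(not set)")
    println(s"spark.sql.hive.metastore.version = $metastoreVersion")
    println(s"spark.sql.hive.metastore.jars    = $metastoreJars")

    // Simple smoke test against the metastore: list the databases it can see.
    spark.sql("SHOW DATABASES").show(truncate = false)
  }
}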