12-16-2021 01:55 PM
I have upgraded Azure Databricks from runtime 6.4 to 9.1, which enables me to use Spark 3. As far as I know, the Hive metastore version has to be upgraded to 2.3.7 as well, as discussed in:
I have tried not setting those options and continued using Hive 0.13 to run my Spark 3 application, and everything seems fine. Is it mandatory to upgrade the Hive metastore to 2.3.7 for a Spark 3 program to run? Everything seems fine without adding:
spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars builtin
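(For what it's worth, a quick way to see which metastore client settings are actually in effect is to read them from a notebook cell. A minimal sketch in Python; it only reads the current values and assumes nothing about an external metastore:)

# Show the effective Hive metastore client settings on the running cluster
spark.sql("SET spark.sql.hive.metastore.version").show(truncate=False)
spark.sql("SET spark.sql.hive.metastore.jars").show(truncate=False)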
01-11-2022 06:55 PM
@Jeffrey Mak All supported versions are mentioned here: https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore. Also, I do not think builtin will work.
01-26-2022 08:19 AM
@Jeffrey Mak - Does Atanu's answer resolve the issue for you? If yes, would you be happy to mark it as best so that other members can find the solution more quickly?
01-26-2022 11:09 AM
I'm asking about Databricks version 9.1. I've followed the URL given (https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore). Do you mind letting me know where in the table the supported Hive version for Databricks 9.1 is mentioned?
02-12-2022 08:30 AM
@Jeffrey Mak The doc is not updated yet; maybe the team is working on it. You can consider the 7.x+ guidance for DBR 9. Please let us know how that goes. Thanks.
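(For anyone landing here later: the page linked above describes the external metastore settings as cluster Spark config. A rough sketch of what that looks like for a Hive 2.3.7 metastore on DBR 7.x+; the JDBC host, database, driver, and credentials below are placeholders, not values from this thread:)

spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars builtin
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://<metastore-host>:1433;database=<metastore-db>
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionUserName <metastore-user>
spark.hadoop.javax.jdo.option.ConnectionPassword <metastore-password>

(If the cluster uses the default Databricks-hosted metastore rather than an external one, none of this is required, which may explain why the application keeps working without the two version settings.)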