12-16-2021 01:55 PM
I have upgraded Azure Databricks from 6.4 to 9.1, which enables me to use Spark 3. As far as I know, the Hive metastore version has to be upgraded to 2.3.7 as well, as discussed in:
I have tried not setting those options and continuing to use Hive 0.13 to run my Spark 3 application, and everything seems fine. Is it a must to upgrade the Hive metastore to 2.3.7 for a Spark 3 program to run? Everything seems fine without adding:
spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars builtin
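For reference, a quick way to see what the cluster is actually using is to read the settings back from a notebook. This is only a minimal sketch, assuming a Databricks notebook where the spark session is predefined; the fallback string is just a placeholder shown when a key was never set explicitly.

# Minimal check, run in a Databricks notebook (the `spark` session is predefined there).
# Prints whichever Hive metastore client version/jars the cluster is configured with;
# "<not explicitly set>" is only a fallback when the key was never set on the cluster.
for key in ("spark.sql.hive.metastore.version", "spark.sql.hive.metastore.jars"):
    print(key, "=", spark.conf.get(key, "<not explicitly set>"))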
01-11-2022 06:55 PM
@Jeffrey Mak all supported versions are mentioned here: https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore. Also, I do not think builtin will work.
01-26-2022 08:19 AM
@Jeffrey Mak - Does Atanu's answer resolve the issue for you? If yes, would you be happy to mark it as best so that other members can find the solution more quickly?
01-26-2022 11:09 AM
I'm asking about Databricks version 9.1. I've followed the URL given (https://docs.microsoft.com/en-us/azure/databricks/data/metastores/external-hive-metastore). Do you mind letting me know where in the table the supported Hive version for Databricks 9.1 is mentioned?
02-12-2022 08:30 AM
@Jeffrey Mak the doc is not updated yet; the team may be working on it. For DBR 9 you can consider the guidance given for 7.x+. Please let us know how that goes. Thanks.
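In case it helps others, below is a rough sketch of what the cluster Spark config might look like for an external Hive 2.3.7 metastore, following the linked doc. The JDBC URL, driver, and credentials are placeholders for your own metastore database, and whether builtin or a set of downloaded Hive jars is appropriate depends on your DBR version, so treat this as an assumption to verify against the doc rather than a confirmed setup:

# Hive metastore client version to match the external metastore schema
spark.sql.hive.metastore.version 2.3.7
# builtin as in the question above; verify against the linked doc for your DBR version
spark.sql.hive.metastore.jars builtin
# Placeholder JDBC connection details for your own metastore database
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://<metastore-host>:1433;database=<metastore-db>
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionUserName <metastore-user>
spark.hadoop.javax.jdo.option.ConnectionPassword <metastore-password>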