08-10-2021 07:12 AM
Is it possible to upgrade the out-of-the-box Hive metastore version?
Running spark.conf.get("spark.sql.hive.metastore.version") indicates that it is running on 0.13.0.
However, https://docs.microsoft.com/en-us/azure/databricks/release-notes/runtime/7.x-migration#apache-hive seems to indicate that the version was upgraded to 2.3.
I have attempted to add the following to the spark config
spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars builtin
But it results in errors whose stack trace seems to indicate it's trying to connect to an external metastore. I'm not interested in setting up an external metastore at this time.
What should the Hive metastore version be, and is there anything I need to do to upgrade it?
11-22-2021 09:48 PM
The only way I found to solve this is to use an external metastore, so I am using an Azure SQL DB hosted Hive metastore. It creates a dbo.VERSION table with a row showing the version of the metastore. Update this row to match the spark.sql.hive.metastore.version value.
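For reference, against the Azure SQL database backing the metastore, that update would look roughly like the sketch below. The '2.3.0' value and the VER_ID filter are assumptions (Hive 2.3.x reports schema version 2.3.0); inspect your actual dbo.VERSION row first and use whatever value your Spark config expects.

```
-- Sketch only: align the metastore schema version row with
-- the cluster's spark.sql.hive.metastore.version setting.
-- Check the existing row first:
SELECT * FROM dbo.VERSION;

UPDATE dbo.VERSION
SET SCHEMA_VERSION = '2.3.0'
WHERE VER_ID = 1;
```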
12-16-2021 01:30 PM
I have exactly the same issue. My question is: is it a must to use 2.3.7, or is it still OK to use 0.13.0 with Spark 3?
12-22-2021 08:34 AM
That does not help. The suggestion to bypass the version validation only delays the issue. When I tried to create new tables, it complained that some columns were missing in the Hive metastore (0.13).
01-11-2022 07:36 PM
Can you try changing it as per the following?
Hive 2.3.7 (Databricks Runtime 7.0 and above): set spark.sql.hive.metastore.jars to builtin.
For all other Hive versions, Azure Databricks recommends that you download the metastore JARs and set the configuration spark.sql.hive.metastore.jars to point to the downloaded JARs using the procedure described in Download the metastore jars and point to them.
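For the non-builtin route described above, the cluster's Spark config would look roughly like the sketch below. This is an illustration only: the /dbfs/hive-metastore-jars/ path is a placeholder for wherever you copied the downloaded metastore JARs (for example via an init script), and 1.2.1 stands in for whichever non-2.3.7 Hive version you need.

```
spark.sql.hive.metastore.version 1.2.1
spark.sql.hive.metastore.jars /dbfs/hive-metastore-jars/*
```

With jars set to builtin instead, only Hive 2.3.7 metastore clients are bundled, which is why other versions require pointing at downloaded JARs.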
01-27-2022 02:41 AM
That doesn't seem to solve the problem.
We appear to be using Hive 0.13.0; the docs mention we should be on 2.3.7. Is there something we have to do on our end to upgrade?
Running the queries gives
spark.conf.get("spark.sql.hive.metastore.jars") //builtin
spark.conf.get("spark.sql.hive.metastore.version") //0.13.0
Setting spark.sql.hive.metastore.jars to builtin does not change the metastore version.
How do we upgrade our builtin metastore?
01-26-2022 08:18 AM
@Alex Davies - Does Atanu's information help resolve the issue? If yes, would you be happy to mark it as best so that other members can find the solution more quickly?
07-25-2022 05:09 AM
Hello guys!
Atanu's post, although correct, does not solve the problem. Is there any official documentation on how to upgrade the internal Databricks metastore to a newer version? If that is available, we can try Atanu's solution (not sure if it is needed in that case).