spark.sql() has suddenly started failing with "Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient" in Databricks jobs and Python scripts that ran without problems last month. There have been no local changes on my end.
What could be causing this, and what should I be looking at to track it down?
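For reference, this is roughly how the session is created and the call that fails (a minimal sketch; the connect URL, token, and query below are placeholders rather than my actual setup):

```python
from pyspark.sql import SparkSession

# Spark Connect session pointed at the Databricks cluster
# (host and token are placeholders, not my real endpoint)
spark = (
    SparkSession.builder
    .remote("sc://<workspace-host>:443/;token=<personal-access-token>")
    .getOrCreate()
)

# Any query that touches the metastore now raises the exception below
spark.sql("SHOW TABLES IN default").show()
```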
```
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/mbarlett/projects/ds-pipelines/.venv/lib/python3.10/site-packages/pyspark/sql/connect/session.py", line 503, in sql
    data, properties = self.client.execute_command(cmd.command(self._client))
  File "/home/mbarlett/projects/ds-pipelines/.venv/lib/python3.10/site-packages/pyspark/sql/connect/client/core.py", line 892, in execute_command
    data, _, _, _, properties = self._execute_and_fetch(req)
  File "/home/mbarlett/projects/ds-pipelines/.venv/lib/python3.10/site-packages/pyspark/sql/connect/client/core.py", line 1172, in _execute_and_fetch
    for response in self._execute_and_fetch_as_iterator(req):
  File "/home/mbarlett/projects/ds-pipelines/.venv/lib/python3.10/site-packages/pyspark/sql/connect/client/core.py", line 1153, in _execute_and_fetch_as_iterator
    self._handle_error(error)
  File "/home/mbarlett/projects/ds-pipelines/.venv/lib/python3.10/site-packages/pyspark/sql/connect/client/core.py", line 1308, in _handle_error
    self._handle_rpc_error(error)
  File "/home/mbarlett/projects/ds-pipelines/.venv/lib/python3.10/site-packages/pyspark/sql/connect/client/core.py", line 1344, in _handle_rpc_error
    raise convert_exception(info, status.message) from None
pyspark.errors.exceptions.connect.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
```