Hi @balwantsingh24
Internal Metastore:-
Internal metastores are managed by Databricks and store metadata about databases, tables, views, and user-defined functions (UDFs). This metadata powers operations such as the SHOW TABLES SQL command and the Tables UI in Databricks. The internal metastore is based on the Apache Hive metastore and is hosted by Databricks for its customers.
External Metastore:-
External metastores, on the other hand, are customer-hosted and can be configured to connect to Databricks clusters. This setup is often used to share metadata between Databricks and other systems such as Hive, AWS Glue, or Presto. Customers explicitly configure this for their own requirements or use case.
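For reference, an external Hive metastore is typically wired up through Spark configuration on the cluster. The sketch below is a minimal example only; the hostname, database name, user, and password are placeholders, and the metastore version and JDBC driver must match your actual deployment:

```
spark.sql.hive.metastore.version 2.3.9
spark.sql.hive.metastore.jars builtin
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:mysql://<metastore-host>:3306/<metastore-db>
spark.hadoop.javax.jdo.option.ConnectionDriverName org.mariadb.jdbc.Driver
spark.hadoop.javax.jdo.option.ConnectionUserName <user>
spark.hadoop.javax.jdo.option.ConnectionPassword <password>
```

In practice, credentials are usually supplied via secrets rather than plain text, and `spark.sql.hive.metastore.jars` may need to point to downloaded metastore JARs if your metastore version differs from the built-in one.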
To check the driver logs, you can follow the link below:
https://docs.databricks.com/en/compute/clusters-manage.html#compute-driver-and-worker-logs
Based on the errors shown in the driver logs, we can identify the likely cause of this issue.
The error "Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetastoreClient" typically occurs when the Spark driver cannot establish a Hive client connection to the metastore. Is port 3306 allowed on the Databricks VPC, as per the docs?
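One quick way to rule out a network/firewall issue is to check from a notebook on the cluster whether the metastore port is reachable at all. The snippet below is a generic TCP reachability check; `mymetastore-host` is a placeholder for your actual metastore hostname, and 3306 assumes a MySQL-backed metastore:

```python
import socket


def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Example (placeholder host; replace with your metastore hostname):
# can_reach("mymetastore-host", 3306)
```

If this returns False from the driver, the problem is connectivity (security groups, NSG rules, or VPC peering) rather than the Hive client configuration itself.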