How many active connections are made to Hive metastore

User16869510359
Esteemed Contributor

We are using an internal metastore implementation, i.e. the metastore is hosted on the Databricks side. However, we believe the metastore instance made available for our workspace is not adequate to handle the load. How can I monitor the number of connections made from the clusters to the Hive metastore?

1 ACCEPTED SOLUTION

User16869510359
Esteemed Contributor

Run the code snippet below from a notebook. It connects directly to the metastore database with the cluster's own JDBC credentials and lists the active connections from MySQL's PROCESSLIST table:

%scala
import java.sql.DriverManager

// Column meanings for this query are documented at
// https://dev.mysql.com/doc/refman/8.0/en/processlist-table.html

def printConnections(): Unit = {
  // The metastore JDBC URL and credentials are read from the
  // cluster's Hadoop configuration.
  val hadoopConf = spark.sparkContext.hadoopConfiguration
  val metastoreURL = hadoopConf.get("javax.jdo.option.ConnectionURL")
  val metastoreUser = hadoopConf.get("javax.jdo.option.ConnectionUserName")
  val metastorePassword = hadoopConf.get("javax.jdo.option.ConnectionPassword")

  val connection = DriverManager.getConnection(metastoreURL, metastoreUser, metastorePassword)
  try {
    val statement = connection.createStatement()
    val resultSet = statement.executeQuery(
      "SELECT * FROM INFORMATION_SCHEMA.PROCESSLIST ORDER BY Host")

    val rsmd = resultSet.getMetaData
    val columnCount = rsmd.getColumnCount

    // Header row.
    (1 to columnCount).foreach { i =>
      print(rsmd.getColumnName(i) + "\t\t\t\t\t\t\t")
    }
    println()

    // One line per active connection to the metastore database.
    while (resultSet.next()) {
      (1 to columnCount).foreach { i =>
        val data = Option(resultSet.getString(i)).map(_.trim).getOrElse("")
        print(data + "\t\t\t\t\t\t")
      }
      println()
    }
    statement.close()
  } finally {
    connection.close()
  }
}

printConnections()
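Since the original question is about the *number* of connections, the per-host rows can be aggregated rather than eyeballed. A minimal sketch of that aggregation (the sample `hosts` values are made up for illustration; in practice each entry would come from `resultSet.getString("Host")` in the loop above):

```scala
// Hypothetical sample of Host values as returned by
// INFORMATION_SCHEMA.PROCESSLIST, in "address:port" form.
val hosts = Seq(
  "10.0.1.12:40210", "10.0.1.12:40218", "10.0.2.7:51002"
)

// Strip the ephemeral client port so connections from the same
// cluster node are counted together, then count per host.
val connectionsPerHost: Map[String, Int] =
  hosts
    .map(_.split(":").head)
    .groupBy(identity)
    .map { case (host, conns) => host -> conns.size }

connectionsPerHost.foreach { case (host, n) =>
  println(s"$host -> $n connection(s)")
}
```

The same count can also be pushed into the query itself, e.g. `SELECT Host, COUNT(*) FROM INFORMATION_SCHEMA.PROCESSLIST GROUP BY Host`, which avoids pulling every row back to the notebook.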

