Hi @Vamsee krishna kanth Arcot Yes, currently you will have to download the JDBC driver from https://databricks.com/spark/jdbc-drivers-download and connect from other applications using a JDBC URL, just as you mentioned in your example. There is an internal ...
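As a rough sketch of what that connection setup looks like, here is a small helper that assembles a Databricks JDBC URL. The host and httpPath values below are placeholders, and the exact URL parameters (e.g. `transportMode`, `AuthMech`) can vary by driver version, so treat this as an illustration rather than the definitive format for your workspace:

```python
def build_jdbc_url(host: str, http_path: str, port: int = 443) -> str:
    """Assemble a JDBC URL for the Databricks JDBC driver.

    host:      the workspace hostname (placeholder below)
    http_path: the cluster or SQL warehouse HTTP path from the
               Connection Details tab (placeholder below)
    """
    return (
        f"jdbc:databricks://{host}:{port}/default;"
        f"transportMode=http;ssl=1;"
        f"httpPath={http_path};"
        f"AuthMech=3"  # AuthMech=3 = username/token auth in the Simba driver
    )

# Example with made-up identifiers:
url = build_jdbc_url(
    "adb-1234567890123456.7.azuredatabricks.net",
    "sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh",
)
print(url)
```

You would then pass this URL (plus your credentials or personal access token) to whatever JDBC client your external application uses, e.g. `DriverManager.getConnection(url, "token", "<PAT>")` in Java.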
The job clusters for finished or failed runs are shown in the Job Clusters UI. Up to 30 of the most recently terminated job clusters are retained there; older ones are removed. Finished or canceled runs are also cleaned up automatically starting...