Hi @HariPrasad1, here is a way to build the Spark UI URL for your job (note: this works for non-serverless clusters):
from dbruntime.databricks_repl_context import get_context

# Pull the identifiers needed to assemble the Spark UI URL
cluster_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")
workspace_url = spark.conf.get("spark.databricks.workspaceUrl")
context_id = spark.conf.get("spark.databricks.sparkContextId")
workspace_id = get_context().workspaceId

# For non-serverless clusters
spark_job_ui_url = f"https://{workspace_url}/compute/sparkui/{cluster_id}/driver-{context_id}?o={workspace_id}"
print(spark_job_ui_url)
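If you want to sanity-check the URL format outside a notebook (where `spark` and `get_context()` are not available), here is a minimal sketch with placeholder values; all of the IDs below are made-up assumptions standing in for the real `spark.conf` lookups:

```python
# Hypothetical placeholder values -- in a notebook these come from
# spark.conf.get(...) and get_context() as shown above.
workspace_url = "adb-1234567890123456.7.azuredatabricks.net"
cluster_id = "0101-120000-abcd1234"
context_id = "1234567890123456789"
workspace_id = "1234567890123456"

# Same URL pattern as above, just assembled from local variables
spark_job_ui_url = (
    f"https://{workspace_url}/compute/sparkui/{cluster_id}"
    f"/driver-{context_id}?o={workspace_id}"
)
print(spark_job_ui_url)
```

Pasting the real values into this template should produce a link that opens the driver's Spark UI for that cluster in your workspace.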
Kindly tell me if it works for you!
Eni