
Jobs in Spark UI

HariPrasad1
New Contributor II

Is there a way to get the URL where all the Spark jobs created in a specific notebook run can be found? I am building an audit framework, and the requirement is to retrieve the Spark jobs for a specific task or notebook run so that we can debug the job if there are any errors.

3 REPLIES

eniwoke
Contributor II

Hi @HariPrasad1, if I understand correctly, you want to programmatically retrieve the Spark UI URL (which gives access to the list of jobs) from within a notebook while a job is running?

Eni

HariPrasad1
New Contributor II

Yes @eniwoke, you are correct.

eniwoke
Contributor II

Hi @HariPrasad1, here is a way to build the Spark UI URL for the job list (note: this works on non-serverless clusters):

from dbruntime.databricks_repl_context import get_context

# These Spark confs are populated automatically on Databricks clusters;
# `spark` is the SparkSession predefined in every Databricks notebook.
cluster_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")
workspace_url = spark.conf.get("spark.databricks.workspaceUrl")
context_id = spark.conf.get("spark.databricks.sparkContextId")

# The REPL context exposes workspace metadata such as the workspace ID.
workspace_id = get_context().workspaceId

# For non-serverless clusters
spark_job_ui_url = f"https://{workspace_url}/compute/sparkui/{cluster_id}/driver-{context_id}?o={workspace_id}"
print(spark_job_ui_url)
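
If the audit framework also needs the individual job IDs for a given notebook run, and not just the UI URL, one option (an assumption about the use case, not something asked above) is to tag the run with a Spark job group and read the IDs back through PySpark's status tracker. A minimal sketch, meant to run in the same notebook as the snippet above; the group name "audit-notebook-run" and the sample workload are placeholders:

sc = spark.sparkContext
# Tag every job submitted from this thread with an identifiable group.
sc.setJobGroup("audit-notebook-run", "Jobs triggered by this notebook run")

# Example workload; any Spark action would do.
spark.range(10_000_000).selectExpr("sum(id)").collect()

# Read back the job IDs that carry the group tag.
tracker = sc.statusTracker()
for job_id in tracker.getJobIdsForGroup("audit-notebook-run"):
    info = tracker.getJobInfo(job_id)
    print(f"Job {job_id}: {info.status} (visible on the Jobs tab at {spark_job_ui_url})")

Note that setJobGroup applies per thread, so only jobs kicked off from the same notebook cell thread pick up the tag.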

Kindly tell me if it works for you 🙂

Eni
