02-16-2022 08:34 PM
In Spark we can get the Spark Application ID inside the Task programmatically using:
SparkEnv.get.blockManager.conf.getAppId
and we can get the Stage ID and Task Attempt ID of the running Task using:
TaskContext.get.stageId
TaskContext.get.taskAttemptId
Is there any way to get the Spark Job Id that is associated with a running Task (preferably using TaskContext or SparkEnv)?
Linked Question on StackOverflow: https://stackoverflow.com/questions/70929032/how-to-programmatically-get-the-spark-job-id-of-a-runni...
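For reference, a minimal Scala sketch of the task-side lookups described above (the sample RDD is just a placeholder, and the job ID line is the piece I can't find an API for):

import org.apache.spark.{SparkEnv, TaskContext}

// Runs inside the tasks on the executors; "sc" is an existing SparkContext.
val ids = sc.parallelize(1 to 4, 2).mapPartitions { iter =>
  val appId = SparkEnv.get.blockManager.conf.getAppId // application ID
  val ctx = TaskContext.get
  val stageId = ctx.stageId // stage ID of the running task
  val taskAttemptId = ctx.taskAttemptId // task attempt ID of the running task
  // val jobId = ??? // missing piece: no task-side API seems to expose the job ID
  Iterator((appId, stageId, taskAttemptId, iter.size))
}.collect()

ids.foreach(println)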
05-04-2022 09:24 AM
@Franklin George , Honestly, there is no easy way to do this. Your only option is to set up cluster log delivery, which will give you access to the cluster's event log file. This event log file is JSON and contains all of the info that the SparkUI uses (and more). It will have the information you are looking for but is not trivial to parse manually. I can't think of a better option.
02-17-2022 02:49 AM
Hi @Franklin George ! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question first. Or else I will get back to you soon. Thanks.
02-23-2022 12:01 AM
Hi @Franklin George ,
It depends on which language you are using.
Scala
https://spark.apache.org/docs/1.6.1/api/scala/index.html#org.apache.spark.SparkContext
sc.applicationId
Java
https://spark.apache.org/docs/1.6.2/api/java/org/apache/spark/api/java/JavaSparkContext.html
sparkContext.sc().applicationId();
Python
http://spark.apache.org/docs/1.6.2/api/python/pyspark.html#pyspark.SparkContext
sc.applicationId
It can also depend on the Spark version.
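Note that these are driver-side lookups, unlike the TaskContext calls in the question, which run inside the task. A quick Scala check, assuming an existing SparkContext named sc:

// On the driver: the application ID for the whole Spark application.
println(s"application ID: ${sc.applicationId}")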
03-08-2022 05:34 PM
Hi @Franklin George , As mentioned on StackOverflow as well, the jobIdToStageIds mapping is stored in the SparkContext (DAGScheduler), so I don't think it is possible to get this info at the executor level while the task is running.
May I know what you want to do with jobId at the task level? What is the use case here?
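To illustrate what "the mapping lives in the driver" means in practice, here is a minimal driver-side sketch (not something you can call from inside a task) that records the job ID for each stage via a SparkListener; sc is assumed to be an existing SparkContext:

import org.apache.spark.scheduler.{SparkListener, SparkListenerJobStart}
import scala.collection.concurrent.TrieMap

// Driver-side only: remember which job each stage belongs to as jobs start.
val stageToJob = TrieMap.empty[Int, Int]
sc.addSparkListener(new SparkListener {
  override def onJobStart(jobStart: SparkListenerJobStart): Unit = {
    jobStart.stageIds.foreach(stageId => stageToJob(stageId) = jobStart.jobId)
  }
})
// A running task only knows its stageId (TaskContext.get.stageId), so this map would
// have to be shipped to the executors yourself (e.g. via an external store) to be usable there.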
03-08-2022 08:21 PM
Hi @Gaurav Rupnar , I have Spark SQL UDFs (implemented as Scala methods) in which I want to get details about the Spark SQL query that called the UDF, especially a unique query ID, which in Spark SQL is the Spark Job ID. That's why I wanted a way to detect the Job ID from the UDF code itself while it is executing on the executors as tasks.
Part of the logic in my UDF requires this unique query ID (Job ID) to enforce that the UDF executions are consistent for each Spark SQL query.
05-04-2022 09:24 AM
@Franklin George , Honestly, there is no easy way to do this. Your only option is to set up cluster log delivery, which will give you access to the cluster's event log file. This event log file is JSON and contains all of the info that the SparkUI uses (and more). It will have the information you are looking for but is not trivial to parse manually. I can't think of a better option.
05-17-2022 10:28 AM
Hi @Franklin George,
Just a friendly follow-up. Do you still need help, or did any of the responses provided help you resolve your issue? Please let us know.