User16763506477
Databricks Employee

Hi @Franklin George, as mentioned on Stack Overflow as well, the jobIdToStageIds mapping is stored in the driver's SparkContext (inside the DAGScheduler). So I don't think it is possible to get this info at the executor level while a task is running.

May I know what you want to do with the jobId at the task level? What is the use case here?