Identify the associated notebook for the application running from the Spark UI

Data_Engineer3
Contributor II

In the Spark UI, I can see the running application with its application ID. From the Spark UI, is it possible to see which notebook is running with that application?

I am interested in learning more about how jobs and stages work internally in Spark, with the help of the Spark UI in Databricks. Could you guide me on a path to learn this in more depth (share a learning agenda, material, an online course to check out, etc.)?

Note: as a first step, I need to understand how the execution of one complete notebook is handled in terms of jobs and stages in PySpark.

Thanks.

2 REPLIES

Hubert-Dudek
Esteemed Contributor III

https://spark.apache.org/docs/3.1.1/api/python/reference/api/pyspark.SparkContext.setJobDescription....

spark.sparkContext.setJobDescription("my name") will make your life easier. Just put it in the notebook.

You should also call it again before each action (show, count, toPandas, write, aggregation, etc.), since the description applies to the jobs submitted after the call; that way each job in the Spark UI gets its own meaningful label.

 

Kaniz
Community Manager

Hey there! Thanks a bunch for being part of our awesome community! 🎉 

We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution for you. And remember, if you ever need more help, we're here for you!

Keep being awesome! 😊🚀

 
