Scheduling jobs with Airflow results in each task running multiple jobs.

Tacuma
New Contributor II

Hey everyone, I'm experimenting with running containerized PySpark jobs in Databricks and orchestrating them with Airflow. I am, however, encountering an issue: when I trigger an Airflow DAG and look at the logs, I see that Airflow is spinning up multiple jobs on its first try.

It's really strange. I've attached a screenshot - in this instance, I need the job to run only once, but 3 jobs are being run.

Wanted to know if this has been encountered before and whether there are any fixes - I can send cluster and pool config details upon request.

4 REPLIES

daniel_sahal
Honored Contributor III

That's weird behaviour. Can you please share a sample of your Airflow code?

Tacuma
New Contributor II

Thanks Daniel. Sure thing.
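
Here's a trimmed-down sketch of what the DAG looks like - a single task that submits the containerized PySpark job via the Databricks provider's DatabricksSubmitRunOperator. The job name, cluster settings, image, and file path below are placeholders rather than my exact config:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="pyspark_containerized_job",  # placeholder name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    # Single task: submit one run of the containerized PySpark job to Databricks
    run_job = DatabricksSubmitRunOperator(
        task_id="run_pyspark_job",
        databricks_conn_id="databricks_default",
        json={
            "run_name": "containerized-pyspark-run",
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
                # custom container image for the PySpark job (placeholder)
                "docker_image": {"url": "myrepo/pyspark-job:latest"},
            },
            "spark_python_task": {"python_file": "dbfs:/jobs/main.py"},
        },
    )
```

The DAG only defines this one task, so I'd expect exactly one job run per DAG run.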

Debayan
Esteemed Contributor III

Hi @Tacuma Solomon, 3 jobs with the same config, or 3 job runs?

Tacuma
New Contributor II

Both, I guess? Yes, all jobs share the same config - the question I have is why, in the same Airflow task log, there are 3 job runs. I'm hoping there's something in the configs that may give me some kind of clue.
