Hi,
I have set up a job with multiple Spark Python tasks running in parallel. I have configured only one job cluster, single node, with data security mode SINGLE_USER, using Databricks Runtime 14.3.x-scala2.12.
These parallel Spark Python tasks share some similar variable names, but they are not technically global variables; everything is defined inside one main function per file (see the sketch below).
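For context, here is a minimal sketch of what each task file looks like (all file names, variable names, and values are made up for illustration):

```python
# task_a.py: one of the parallel Spark Python tasks (names are illustrative)
from pyspark.sql import SparkSession

def main():
    spark = SparkSession.builder.getOrCreate()
    threshold = 100  # local to this main(), not a module-level global
    df = spark.range(1_000).filter(f"id > {threshold}")
    print(df.count())

if __name__ == "__main__":
    main()
```

The other task files (e.g. a hypothetical task_b.py) have the same shape and reuse the same names, like threshold and df, inside their own main().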
Will the Python tasks somehow share these variables because they run on the same cluster? Can this ever happen on a Databricks cluster?