The behavior you're observing, where the local property for spark.scheduler.pool
is set to a dynamic integer value rather than mirroring the global configuration, is not the default behavior of Spark or Databricks. Normally, global Spark configurations (for example, those set at the cluster level) propagate to individual sessions unless something explicitly overrides them.
Is it possible there is a local override you are not aware of? Everything you are reporting points to something in your environment that is programmatically overriding your global setting. Local properties take precedence over global configuration; that is the pecking order.
Test for the local setting: print(spark.sparkContext.getLocalProperty("spark.scheduler.pool"))
Test for the global setting: print(spark.conf.get("spark.scheduler.pool", None)) — passing a default avoids an exception when the key is unset.
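To make the precedence concrete, here is a minimal sketch. The helper `effective_pool` is a hypothetical name that just mirrors Spark's pecking order (local property wins when set); the commented lines show the actual checks you would run in a notebook, assuming the usual `spark` session object:

```python
# Sketch of Spark's scheduler-pool precedence: a thread-local property,
# when set, shadows the global config. `effective_pool` is a hypothetical
# helper for illustration, not a Spark API.

def effective_pool(local_value, global_value):
    """Return the pool Spark would actually use for this thread."""
    return local_value if local_value is not None else global_value

# In a Databricks notebook, the real checks look like this:
#   local_pool  = spark.sparkContext.getLocalProperty("spark.scheduler.pool")
#   global_pool = spark.conf.get("spark.scheduler.pool", None)
#   print(effective_pool(local_pool, global_pool))
#
# To clear a stray local override for the current thread:
#   spark.sparkContext.setLocalProperty("spark.scheduler.pool", None)
```

If the first check returns an integer-looking string while the second returns your intended pool name, something is calling setLocalProperty after your session starts.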
You may also want to look for any shared libraries, init scripts, or notebook templates that include calls to setLocalProperty, and check the cluster logs for any evidence of dynamic property assignments.
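As a quick way to scan for those calls, something like the following grep over a local checkout of your init scripts or exported notebook sources may help. The directory argument is yours to supply; "." is only a placeholder default:

```shell
#!/bin/sh
# Sketch: recursively search a directory for programmatic pool assignments.
dir="${1:-.}"
# grep exits non-zero when nothing matches, so don't treat that as an error
grep -rn --include='*.py' --include='*.sh' \
     -e 'setLocalProperty' -e 'spark\.scheduler\.pool' "$dir" || true
```

Any hit on setLocalProperty with spark.scheduler.pool is a candidate for the override you are seeing.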
Cheers, Louis.