I am trying to send Spark's default metrics from my application to a StatsD sink at the job level, not the cluster level, so I set the necessary properties on the Spark context and Spark session in code. On a local, single-node system this works: while the application is running, I receive all the UDP packets.

However, when I deploy the same jar to an AWS Databricks cluster (single node or multi node), I do not get any metrics. If I instead override the Spark config under Advanced Options -> Compute in Databricks, I do receive metrics, and those settings are reflected in the environment variables. Below I have attached the job-level configuration code.
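This is a minimal sketch of that job-level configuration (the host, port, and prefix values are placeholders, not my real ones; the keys are Spark's built-in StatsdSink settings):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object MetricsJob {
  def main(args: Array[String]): Unit = {
    // Route all metric instances (master, driver, executor, ...) to the
    // StatsD sink that ships with Spark. Host/port/prefix are placeholders.
    val conf = new SparkConf()
      .set("spark.metrics.conf.*.sink.statsd.class",
           "org.apache.spark.metrics.sink.StatsdSink")
      .set("spark.metrics.conf.*.sink.statsd.host", "127.0.0.1")
      .set("spark.metrics.conf.*.sink.statsd.port", "8125")
      .set("spark.metrics.conf.*.sink.statsd.period", "10")
      .set("spark.metrics.conf.*.sink.statsd.unit", "seconds")
      .set("spark.metrics.conf.*.sink.statsd.prefix", "myjob")

    val spark = SparkSession.builder()
      .appName("statsd-metrics-job")
      .config(conf)
      .getOrCreate()

    // ... job logic ...
    spark.stop()
  }
}
```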
[Screenshot: Databricks Spark cluster config]
I also tried updating "metrics.properties" using an init script on the Databricks cluster, and with that approach I am able to receive metrics.
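The init script simply writes the usual sink settings into the cluster's metrics.properties (a sketch with placeholder values; note that inside metrics.properties the keys drop the `spark.metrics.conf.` prefix):

```
*.sink.statsd.class=org.apache.spark.metrics.sink.StatsdSink
*.sink.statsd.host=127.0.0.1
*.sink.statsd.port=8125
*.sink.statsd.period=10
*.sink.statsd.unit=seconds
```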
I also tried overriding the Spark session configuration again after the Spark session was created, and I print the Spark configuration values to the console to confirm they are set.
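A sketch of that check, assuming the `spark` session from the snippet above (the re-applied host value is again a placeholder):

```scala
// Re-apply one of the sink settings on the live session.
spark.conf.set("spark.metrics.conf.*.sink.statsd.host", "127.0.0.1")

// Print every metrics-related entry the running SparkContext actually holds,
// to confirm whether the job-level settings were applied.
spark.sparkContext.getConf.getAll
  .filter { case (k, _) => k.startsWith("spark.metrics") }
  .foreach { case (k, v) => println(s"$k = $v") }
```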