I am trying to get Spark's default metrics from the application into a StatsD sink at the job level, not the cluster level, so I set the necessary configuration on the Spark context and Spark session in code. On a local single-node system this works while the job runs: I receive all the UDP packets. But when I deploy the same jar to an AWS Databricks cluster (single node or multi node), I do not get any metrics. However, if I override the Spark config under Compute -> Advanced Options in Databricks, I do receive metrics, and the configurations are reflected in the Environment variables. Below I have attached the job-level configuration code.
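For reference, the job-level configuration is along these lines (a sketch, not the exact code in the screenshot; the host, port, and namespace values are placeholders). Note that on Databricks the cluster's Spark context may already exist when the job starts, so `getOrCreate()` can return the existing session and settings passed this way may never reach the metrics system:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Sketch of job-level StatsD metrics config.
// The spark.metrics.conf.* keys mirror what would normally
// live in metrics.properties; host/port are placeholders.
val conf = new SparkConf()
  .set("spark.metrics.conf.*.sink.statsd.class",
       "org.apache.spark.metrics.sink.StatsdSink")
  .set("spark.metrics.conf.*.sink.statsd.host", "127.0.0.1")
  .set("spark.metrics.conf.*.sink.statsd.port", "8125")
  .set("spark.metrics.conf.*.sink.statsd.period", "10")
  .set("spark.metrics.conf.*.sink.statsd.unit", "seconds")
  .set("spark.metrics.namespace", "myJob")

val spark = SparkSession.builder()
  .appName("statsd-metrics-demo")
  .config(conf)
  .getOrCreate()
```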
![dhinesh_balaji_3-1688629795475.png dhinesh_balaji_3-1688629795475.png](/t5/image/serverpage/image-id/2777iCFCDBB672387E0DD/image-size/large/is-moderation-mode/true?v=v2&px=999)
Databricks spark cluster config
![image.png image.png](/t5/image/serverpage/image-id/2775iD4C067E9E4A96DA9/image-size/large/is-moderation-mode/true?v=v2&px=999)
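For comparison, the working cluster-level override pasted into Compute -> Advance Options -> Spark config is roughly the following (space-separated key/value pairs; host and port are placeholders):

```
spark.metrics.conf.*.sink.statsd.class org.apache.spark.metrics.sink.StatsdSink
spark.metrics.conf.*.sink.statsd.host 127.0.0.1
spark.metrics.conf.*.sink.statsd.port 8125
spark.metrics.conf.*.sink.statsd.period 10
spark.metrics.conf.*.sink.statsd.unit seconds
```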
I also tried updating "metrics.properties" with an init script on the Databricks cluster, and with that approach I do receive metrics.
![dhinesh_balaji_6-1688630322016.png dhinesh_balaji_6-1688630322016.png](/t5/image/serverpage/image-id/2780i6159303356B31BAC/image-size/medium/is-moderation-mode/true?v=v2&px=400)
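A minimal sketch of such an init script (on a Databricks cluster the target is /databricks/spark/conf/metrics.properties; the local fallback directory here is only so the sketch can be dry-run elsewhere, and the host/port values are placeholders):

```shell
#!/bin/bash
set -euo pipefail

# Target Spark's metrics.properties. On a Databricks cluster this
# directory exists; the fallback lets the script be dry-run locally.
CONF_DIR="/databricks/spark/conf"
[ -d "$CONF_DIR" ] || CONF_DIR="./spark-conf"
mkdir -p "$CONF_DIR"

# Append a StatsD sink for every metrics instance (driver, executors, ...).
cat >> "$CONF_DIR/metrics.properties" <<'EOF'
*.sink.statsd.class=org.apache.spark.metrics.sink.StatsdSink
*.sink.statsd.host=127.0.0.1
*.sink.statsd.port=8125
*.sink.statsd.period=10
*.sink.statsd.unit=seconds
EOF

echo "Wrote StatsD sink config to $CONF_DIR/metrics.properties"
```

Because the init script runs before the Spark JVMs start, the metrics system picks these settings up at initialization, which is consistent with metrics arriving under this approach.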
I also overrode the Spark session configuration again after the Spark session was created, and I am printing the Spark configuration values to the console.
![image.png image.png](/t5/image/serverpage/image-id/2776i400D0774A86A31CC/image-size/large/is-moderation-mode/true?v=v2&px=999)
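The post-creation override and the console check were along these lines (a sketch; the key names match the job-level configuration, and host/port are placeholders). Setting these on a live session updates the configuration map, which is why the printed values look correct, but Spark's metrics system is initialized at JVM startup, so this alone may not activate the sink:

```scala
// Re-apply the metrics settings on the live session and echo them back.
spark.conf.set("spark.metrics.conf.*.sink.statsd.host", "127.0.0.1")
spark.conf.set("spark.metrics.conf.*.sink.statsd.port", "8125")

// Print every metrics-related setting currently visible on the session.
spark.conf.getAll
  .filter { case (k, _) => k.startsWith("spark.metrics") }
  .foreach { case (k, v) => println(s"$k=$v") }
```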