Hello @Retired_mod ,
Thanks for your help. Sending logs in real time to the workspace URL sounds good, but it can add delay to job execution since the requests are triggered sequentially.
We are looking for something like spark-monitoring for GCP Databricks, but the spark-monitoring library is built specifically for Azure Databricks. Will it work the same way on GCP Databricks?
GCP Databricks is built differently from Azure Databricks, so the internal paths differ as well.
I tried using it on GCP Databricks, but given the difference in resource hierarchy between GCP and Azure, I am unsure whether it can be used.
Also, we are on the 14.3 LTS runtime, and for now I am getting ClassNotFoundExceptions when using the jar's Scala classes.
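One way I have been narrowing down the ClassNotFoundException is to check directly on the driver whether the class from the stack trace is actually visible on the classpath. A minimal JVM sketch (the monitoring class name below is a hypothetical placeholder, not the real spark-monitoring class — substitute the exact name from your stack trace):

```java
// Minimal classpath diagnostic: reports whether a given class can be loaded
// by the current classloader. Run it on the Databricks driver to see if the
// jar's classes are actually on the classpath.
public class ClasspathCheck {
    static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Sanity check: a class that is always present on the JVM classpath.
        System.out.println("java.lang.String loadable: "
                + isLoadable("java.lang.String"));
        // Hypothetical placeholder for the missing spark-monitoring class.
        System.out.println("monitoring class loadable: "
                + isLoadable("com.example.SparkMonitoringListener"));
    }
}
```

If the check returns false, the jar was likely not attached to the cluster (or was built against a different Scala/DBR version than 14.3 LTS), which would explain the exception before any GCP-vs-Azure path differences come into play.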