@Jackson1111 - If you are talking about workflow jobs, you can try running them on a job cluster so that Spark logs are generated separately for each of the workflow jobs (see the sketch below).
But if you mean separating out the logs of Spark jobs within the Spark UI, that would be a new feature. You can raise it as an idea here - https://www.databricks.com/feedback.
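As a rough illustration, here is a minimal sketch of creating a job whose task runs on its own job cluster with cluster log delivery enabled, using the Jobs API 2.1 via plain HTTP. The workspace URL, token, notebook path, node type, and DBFS destination are placeholders you would replace with your own values:

```python
import requests

# Placeholder workspace URL and personal access token -- replace with your own.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

# Job spec (Jobs API 2.1): the task runs on its own job cluster, and
# cluster_log_conf delivers driver/executor logs to a per-job DBFS path.
job_spec = {
    "name": "example-workflow-job",
    "tasks": [
        {
            "task_key": "main_task",
            "notebook_task": {"notebook_path": "/Workspace/Users/me/my_notebook"},
            "new_cluster": {
                "spark_version": "14.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
                "cluster_log_conf": {
                    "dbfs": {"destination": "dbfs:/cluster-logs/example-workflow-job"}
                },
            },
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])
```

With this in place, each run's driver and executor logs are delivered under the configured DBFS destination, organized by cluster, instead of being mixed with logs from other workloads.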