@Jackson1111 - If you are talking about workflow jobs, you can try running them on a job cluster, which generates separate Spark logs for each of the workflow jobs.
But if you mean separating out the logs for individual Spark jobs within the Spark UI, that would be a new feature. You can raise it as an idea here - https://www.databricks.com/feedback.
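As a minimal sketch of the first approach, a job can be defined with its own `new_cluster` and a `cluster_log_conf` so that each run delivers its driver and executor logs to a dedicated location. The job name, notebook path, and DBFS destination below are placeholders, and the node type and Spark version should be adjusted to your workspace:

```json
{
  "name": "example-workflow-job",
  "tasks": [
    {
      "task_key": "main_task",
      "notebook_task": {
        "notebook_path": "/Workspace/Users/you@example.com/my_notebook"
      },
      "new_cluster": {
        "spark_version": "15.4.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 2,
        "cluster_log_conf": {
          "dbfs": {
            "destination": "dbfs:/cluster-logs/example-workflow-job"
          }
        }
      }
    }
  ]
}
```

Because the job cluster is created per run, the logs delivered under the `cluster_log_conf` destination are scoped to that run, which keeps each workflow job's Spark logs separate.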