Hi, I would like to understand Databricks JAR-based workflow tasks. Can I interpret a JAR-based run as something like a spark-submit on the cluster? In the logs, I was expecting to see something like
spark-submit --class com.xyz --num-executors 4
etc. And then there is another task type that I think was introduced recently, the Spark Submit task type. If we have a JAR to execute, do the Spark Submit task type and the JAR task type work similarly or differently under the hood?
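To make the question concrete, here is roughly how I would define each of the two task types in a Jobs API payload, as far as I understand it (the class name, jar path, and task keys are placeholders, and I may have details wrong):

```json
{
  "tasks": [
    {
      "task_key": "jar_task_example",
      "spark_jar_task": {
        "main_class_name": "com.xyz.Main",
        "parameters": ["--input", "/data"]
      },
      "libraries": [{ "jar": "dbfs:/jars/app.jar" }]
    },
    {
      "task_key": "spark_submit_example",
      "spark_submit_task": {
        "parameters": [
          "--class", "com.xyz.Main",
          "--num-executors", "4",
          "dbfs:/jars/app.jar"
        ]
      }
    }
  ]
}
```

So with the JAR task I only name the main class and attach the jar as a library, whereas with the Spark Submit task I seem to pass raw spark-submit arguments. Is that the real difference, or do they converge to the same thing at runtime?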