- 10131 Views
- 8 replies
- 1 kudos
Hello everyone! I am trying to pass a Typesafe config file to the spark-submit task and print the details from the config file.
Code:
import org.slf4j.{Logger, LoggerFactory}
import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spa...
Latest Reply
I've experienced similar issues; please help answer how to get this working. I've tried setting the path to either /dbfs/mnt/blah or dbfs:/mnt/blah, in either spark_submit_task or spark_jar_task (via the cluster spark_conf for Java options); no su...
7 More Replies
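A common way to approach the problem in this thread is to ship the config file alongside the job and point the JVM at it with `-Dconfig.file`, which Typesafe Config's `ConfigFactory.load()` honors. The sketch below is an assumption-laden illustration, not taken from the thread: the paths, file names, and main class are all hypothetical, and the truncated snippets above are left as-is.

```
# Hypothetical sketch: ship app.conf with the job and point the JVM at it.
# From a Databricks cluster, a DBFS mount is visible under the local /dbfs/ prefix.
spark-submit \
  --files /dbfs/mnt/configs/app.conf \
  --driver-java-options "-Dconfig.file=app.conf" \
  --class com.example.Main \
  /dbfs/mnt/jars/app-assembly.jar
```

`--files` copies `app.conf` into the working directory of the driver and executors, so the relative name in `-Dconfig.file=app.conf` resolves there and `ConfigFactory.load()` picks it up instead of the default `application.conf` on the classpath.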
- 1987 Views
- 0 replies
- 0 kudos
How can I pass parameters from Data Factory to a Databricks job that runs a notebook? I know how to pass parameters from Data Factory to a Databricks notebook when ADF calls the notebook directly.
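When ADF invokes a notebook activity directly, the activity's baseParameters surface in the notebook as widgets. To target an existing Databricks job instead, one option is to have ADF (for example via a Web activity) call the Jobs `run-now` REST endpoint and pass `notebook_params`, which the notebook reads the same way with `dbutils.widgets.get`. A hedged sketch; the job ID and parameter names below are invented for illustration:

```
# Hypothetical ADF Web activity payload triggering a job with parameters.
curl -X POST https://<workspace>.azuredatabricks.net/api/2.1/jobs/run-now \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{
        "job_id": 123,
        "notebook_params": { "env": "prod", "run_date": "2022-01-01" }
      }'
# Inside the notebook, read them with dbutils.widgets.get("env"), etc.
```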
- 2390 Views
- 4 replies
- 9 kudos
Question - When you set a recurring job to simply run a notebook, does Databricks clear the state of the notebook prior to executing it? If not, can I configure it to make sure it clears the state before running?
Latest Reply
@Paras Patel - Would you be happy to mark Hubert's answer as best so that other members can find the solution more easily? Thanks!
3 More Replies
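One way to guarantee a clean slate, regardless of what happens on an interactive cluster, is to run the scheduled job on a fresh job cluster: each run then starts in a brand-new execution context, so no notebook state survives between runs. A sketch using the Jobs API; every name and value here is illustrative, not from the thread:

```
# Hypothetical job definition: a new cluster per run means each
# execution starts with a clean notebook state.
curl -X POST https://<workspace>.azuredatabricks.net/api/2.1/jobs/create \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{
        "name": "nightly-refresh",
        "new_cluster": {
          "spark_version": "9.1.x-scala2.12",
          "node_type_id": "Standard_DS3_v2",
          "num_workers": 2
        },
        "notebook_task": { "notebook_path": "/Repos/demo/refresh" },
        "schedule": { "quartz_cron_expression": "0 0 2 * * ?", "timezone_id": "UTC" }
      }'
```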
- 2054 Views
- 2 replies
- 4 kudos
Hi community! I would like to know if it is possible to start a multi-task job run from a specific task. The use case is as follows: I have a 17-task job. A task in the middle, let's say a task after 2 dependencies, fails. I found the error and now it i...
Latest Reply
+1 to what @Dan Zafar said. We're working hard on this. Looking forward to bringing this to you in the near future.
1 More Replies
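If the capability hinted at in the reply has since shipped as the Jobs 2.1 "repair run" endpoint, restarting a multi-task run from a failed task looks roughly like the sketch below. This is an assumption about post-thread functionality; the run ID and task key are hypothetical:

```
# Hypothetical repair request: re-run only the failed task (and its
# downstream dependents) within an existing multi-task job run.
curl -X POST https://<workspace>.azuredatabricks.net/api/2.1/jobs/runs/repair \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{
        "run_id": 455644833,
        "rerun_tasks": ["transform_orders"]
      }'
```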