- 10207 Views
- 8 replies
- 1 kudos
Hello everyone! I am trying to pass a Typesafe config file to the spark-submit task and print the details from the config file. Code:
import org.slf4j.{Logger, LoggerFactory}
import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spa...
Latest Reply
I've experienced similar issues; please help with how to get this working. I've tried setting the path below to either /dbfs/mnt/blah or dbfs:/mnt/blah, in either spark_submit_task or spark_jar_task (via cluster spark_conf for Java options); no su...
7 More Replies
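A minimal sketch of one way to wire this up, assuming the config file has been uploaded to a DBFS mount; the path /dbfs/mnt/config/app.conf and the object name are placeholders, not taken from the thread:
import com.typesafe.config.{Config, ConfigFactory}
import java.io.File

object ConfigDemo {
  def main(args: Array[String]): Unit = {
    // DBFS mounts are exposed to the driver JVM under /dbfs.
    // Alternatively, set spark.driver.extraJavaOptions to
    // -Dconfig.file=/dbfs/mnt/config/app.conf and call ConfigFactory.load().
    val conf: Config = ConfigFactory.parseFile(new File("/dbfs/mnt/config/app.conf"))
    // Print every key/value pair so the loaded settings can be verified.
    conf.entrySet().forEach(e => println(s"${e.getKey} = ${e.getValue.unwrapped()}"))
  }
}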
- 3620 Views
- 2 replies
- 1 kudos
Hi! I currently have this old generic template, amended over time to optimize Databricks Spark execution. Can you help me figure out whether it still makes sense for runtimes v10-11-12, or whether there are new recommendations? Maybe some of this is making my pr...
Latest Reply
@Alejandro Martinez: Hi! Your template seems like a good starting point for configuring a SparkSession in Databricks. However, there are some new recommendations you can consider for Databricks Runtime versions v10-11-12. Here are some suggest...
1 More Reply
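A hedged sketch, not a definitive template: on Databricks the session already exists as spark, so prefer spark.conf.set over building a new one. Adaptive Query Execution is on by default in recent runtimes, so setting it explicitly mainly documents intent:
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("template-check")
  // AQE replaces many manual shuffle-partition tweaks in newer runtimes.
  .config("spark.sql.adaptive.enabled", "true")
  .getOrCreate()

// Inspect what the runtime actually resolved:
println(spark.conf.get("spark.sql.adaptive.enabled"))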
- 3877 Views
- 4 replies
- 4 kudos
Hi! I'm starting to test configs on Databricks, for example, to avoid corrupting data if two processes try to write at the same time:
.config('spark.databricks.delta.multiClusterWrites.enabled', 'false')
Or if I need more partitions than the default:
.confi...
Latest Reply
Hey there @Alejandro Martinez, hope everything is going well. Just wanted to see if you were able to find an answer to your question. If yes, would you be happy to let us know and mark it as best so that other members can find the solution more quickl...
3 More Replies
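For reference, both keys below are taken from the question above; the values are illustrative examples, not recommendations. On Databricks these can be set at runtime on the existing session:
// Disable multi-cluster writes to a Delta table (key from the question).
spark.conf.set("spark.databricks.delta.multiClusterWrites.enabled", "false")
// Raise the shuffle partition count above the default (example value).
spark.conf.set("spark.sql.shuffle.partitions", "400")

// Read a value back to confirm it took effect in this session:
println(spark.conf.get("spark.sql.shuffle.partitions"))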
- 4753 Views
- 5 replies
- 11 kudos
Hi, I'm running a couple of notebooks in my pipeline and I would like to set a fixed value for 'spark.sql.shuffle.partitions' - the same value for every notebook. Should I do that by adding spark.conf.set... code in each notebook (Runtime SQL configurations ar...
Latest Reply
Hi, thank you all for the tips. I tried setting this option in the Spark config before, but it didn't work for some reason. Today I tried again and it's working :).
4 More Replies
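A sketch of the cluster-level approach the reply refers to, with 200 as an example value: instead of calling spark.conf.set in every notebook, add one "key value" line to the cluster's Spark config (cluster settings > Advanced options > Spark), and every attached notebook inherits it.
// In the cluster's Spark config field, one setting per line:
// spark.sql.shuffle.partitions 200

// Then verify from any notebook attached to that cluster:
println(spark.conf.get("spark.sql.shuffle.partitions"))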
- 1977 Views
- 0 replies
- 0 kudos
Yes, you can use the widgets API to validate the input before you pass the values to the rest of your code. For example:
folder = dbutils.widgets.get("Folder")
if folder == "":
    raise Exception("Folder missing")
or to get spark se...
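A Scala equivalent of the Python snippet above, for notebooks on the JVM side; the widget name "Folder" comes from the example, and the exception type is an illustrative choice:
// Define the widget with an empty default, then read and validate it.
dbutils.widgets.text("Folder", "")
val folder = dbutils.widgets.get("Folder")
if (folder.isEmpty)
  throw new IllegalArgumentException("Folder missing")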