DBR 11.3 LTS - Specify spark.app.id for Python notebook task
07-05-2023 03:24 AM
Hi, I use the Terraform `databricks_job` resource to submit a job with many tasks. AFAIK, each task runs as a separate Spark application with its own applicationId (https://spark.apache.org/docs/3.1.3/api/python/reference/api/pyspark.SparkContext.applicationId.html). The tasks themselves work fine, but they all end up with the same applicationId: Databricks generates application IDs in the form `app_yyyyMMddHHmmss`, and since Terraform submits all the tasks at the same moment, they all receive the same ID. Does Databricks support specifying `spark.app.id` for a Python notebook task?
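For context, here is a minimal sketch of how the collision can be observed from inside each notebook task. It assumes the `spark` SparkSession that Databricks pre-creates in notebooks; `applicationId` itself is standard PySpark:

```python
# Minimal sketch, assuming the `spark` SparkSession that Databricks
# pre-creates in every notebook. Run this in each task of the job to
# see that concurrently submitted tasks report the same application ID.
app_id = spark.sparkContext.applicationId
print(f"applicationId: {app_id}")  # app_yyyyMMddHHmmss-style on Databricks

# Note: setting the config on the live session does not change the ID,
# since applicationId is fixed when the SparkContext is created -- hence
# the question of whether it can be supplied up front for the task.
spark.conf.set("spark.app.id", "my-custom-id")
print(spark.sparkContext.applicationId)  # still the original value
```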
0 REPLIES

