I'm creating a new job in Databricks using the databricks-cli:

    databricks jobs create --json-file ./deploy/databricks/config/job.config.json

With the following JSON:

    {
      "name": "Job Name",
      "new_cluster": {
        "spark_version": "4.1.x-scala2.1...
This is an old post but still relevant for future readers, so I will answer how it is done. You need to add a base_parameters field in the notebook_task config, like the following:

    "notebook_task": {
      "notebook_path": "...",
      "base_parameters": {
        ...
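For context, a complete notebook_task stanza might look like the sketch below; the notebook path and parameter names here are hypothetical, not values from the original post:

```json
"notebook_task": {
  "notebook_path": "/Users/someone@example.com/my-notebook",
  "base_parameters": {
    "env": "staging",
    "run_date": "2020-01-01"
  }
}
```

Inside the notebook, each parameter can then be read as a widget value, e.g. dbutils.widgets.get("env").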
spark_conf needs to be set before the cluster starts, or the existing cluster has to be restarted. Hence, the spark_conf tag is only available on the job cluster; you may have to set the configs manually on an interactive cluster prior to using ...
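Since spark_conf applies at cluster startup, it belongs inside the new_cluster block of the job definition. A minimal sketch, assuming illustrative values (the runtime label, node type, worker count, and config key below are examples, not from the original thread):

```json
"new_cluster": {
  "spark_version": "7.3.x-scala2.12",
  "node_type_id": "Standard_DS3_v2",
  "num_workers": 2,
  "spark_conf": {
    "spark.sql.shuffle.partitions": "64"
  }
}
```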
You may try to get the job details from the Jobs API (https://docs.databricks.com/dev-tools/api/latest/jobs.html#operation/JobsGet) and use the response to duplicate the job.
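As a sketch of that clone-via-API approach, the snippet below only builds the request URL and auth header for the Jobs API "get" endpoint; the host, token, and job ID are hypothetical placeholders, and the actual HTTP calls are shown as comments:

```python
# Sketch of "duplicate a job via the Jobs API" (host, token, and job ID are
# hypothetical placeholders, not values from the original thread).
HOST = "https://example.cloud.databricks.com"
JOB_ID = 123


def jobs_get_url(host: str, job_id: int) -> str:
    """URL for the Jobs API 'get' endpoint from the linked documentation."""
    return f"{host}/api/2.1/jobs/get?job_id={job_id}"


def auth_header(token: str) -> dict:
    """Bearer-token header the Databricks REST API expects."""
    return {"Authorization": f"Bearer {token}"}


url = jobs_get_url(HOST, JOB_ID)
# With the `requests` package and a real personal access token you would run:
#   resp = requests.get(url, headers=auth_header(token)).json()
#   settings = resp["settings"]  # reuse as the payload for jobs/create
print(url)
```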
This morning I encountered an issue when trying to create a new job using the Workflows UI (in the browser). I have never had this issue before. The error message that appears is: "You are not entitled to run this type of task, please contact your Databricks admi...
@Kaniz Fatma @Philip Nord, thanks! I was able to do what I needed by cloning an existing job and modifying it. That's fine as a temporary fix for now. Thanks again for the response -- good to know you're aware of it and that this isn't anything on my end.