- 1186 Views
- 2 replies
- 2 kudos
Hi, I am currently running a number of notebook jobs from Azure Data Factory. A new requirement has come up where I need to capture, in ADF, a return code generated from the notebook. I tried using dbutils.notebook.exit(json.dumps({"return_v...
Latest Reply
I tried dbutils.notebook.exit(<value>) with value set to 0 and to 1. I also tried sys.exit(<value>). None of this produces any difference in the output of the Databricks CLI command databricks jobs run-now --json '{"job_id": <JOB_ID>, "job_parameters": {"will...
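One pattern that usually works here (a minimal sketch, assuming the ADF Databricks Notebook activity; the "return_value" key is illustrative, not a required name): serialize the value with json.dumps and hand it to dbutils.notebook.exit, then read it back in ADF from the activity output.

```python
# Minimal sketch (runs inside a Databricks notebook, where dbutils is predefined).
# The "return_value" key is illustrative, not a required name.
import json

return_code = 0  # set to a non-zero value on failure paths

# dbutils.notebook.exit only accepts a string, so serialize structured data as JSON.
dbutils.notebook.exit(json.dumps({"return_value": return_code}))
```

On the ADF side, a dynamic expression along the lines of @activity('Notebook1').output.runOutput should then expose the serialized value; the activity name here is hypothetical.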
1 More Reply
- 4783 Views
- 5 replies
- 2 kudos
If I create a job from the web UI and I select Python wheel, I can add kwargs parameters. Judging from the generated JSON job description, they appear under a section named `namedParameters`. However, if I use the REST APIs to create a job, it appears...
Latest Reply
@GabrieleMuciacc, in the case of a serverless compute job this can be passed as an external dependency; you can't use libraries. "tasks": [{ "task_key": task_id, "spark_python_task": { "python_file": py_file, ...
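For the original question about kwargs on a Python wheel task, a minimal sketch of a Jobs API 2.1 create payload, assuming named_parameters under python_wheel_task is the REST counterpart of the UI's kwargs; the host, token, package name, entry point, cluster config, and wheel path are all placeholders.

```python
# Minimal sketch: create a job with a python_wheel_task that takes kwargs.
import requests

host = "https://<workspace-host>"
token = "<personal-access-token>"

payload = {
    "name": "wheel-job-with-kwargs",
    "tasks": [
        {
            "task_key": "main",
            "python_wheel_task": {
                "package_name": "my_package",   # hypothetical wheel package
                "entry_point": "main",          # hypothetical entry point
                "named_parameters": {"param1": "a", "param2": "b"},
            },
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 1,
            },
            # The wheel itself still has to be attached as a library.
            "libraries": [{"whl": "dbfs:/FileStore/wheels/my_package-0.1-py3-none-any.whl"}],
        }
    ],
}

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # {"job_id": ...}
```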
4 More Replies
- 4361 Views
- 6 replies
- 2 kudos
If I were to stop a rather large job run, say halfway through execution, will any actions performed on our Delta tables persist, or will they be rolled back? Are there any other risks that I need to be aware of in terms of cancelling a job run halfway t...
Latest Reply
Hi, is there any way in 2024 to ensure transaction control across tables in the Delta protocol for failing jobs?
5 More Replies
- 29114 Views
- 5 replies
- 2 kudos
I am trying to create a JAR for an Azure Databricks job, but some code that works when using the notebook interface does not work when calling the library through a job. The weird part is that the job will complete the first run successfully, but on an...
Latest Reply
I am facing a similar issue when trying to use the from_utc_timestamp function. I am able to call the function from a Databricks notebook, but when I use the same function inside my Java JAR and run it as a job in Databricks, it gives the error below. Analys...
4 More Replies
by Pritam • New Contributor II
- 4509 Views
- 4 replies
- 1 kudos
I am not able to create jobs via the Jobs API in Databricks. Error = INVALID_PARAMETER_VALUE: Job settings must be specified. I simply copied the JSON file and saved it, then loaded the same JSON file and tried to create the job via the API, but got the above erro...
Latest Reply
rAlex • New Contributor III
@Pritam Arya I had the same problem today. To use the JSON that you can get from the GUI of an existing job in a request to the Jobs API, you want to send just the JSON that is the value of the settings key.
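A minimal sketch of that workflow, assuming the JSON was exported from an existing job (the shape returned by GET /api/2.1/jobs/get, which nests the definition under a settings key); the host, token, and file name are placeholders.

```python
# Minimal sketch: strip the outer wrapper and pass only the settings object to jobs/create.
import json
import requests

host = "https://<workspace-host>"
token = "<personal-access-token>"

with open("job_from_ui.json") as f:   # hypothetical file copied from the UI / jobs/get
    exported = json.load(f)

# jobs/create expects the settings object at the top level, not nested under "settings".
settings = exported.get("settings", exported)

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=settings,
)
resp.raise_for_status()
print(resp.json())  # {"job_id": ...}
```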
3 More Replies
by j_al • New Contributor II
- 6910 Views
- 10 replies
- 5 kudos
Jobs API 2.1 OpenAPI specification seems broken. The swagger file seems to be invalid: https://docs.databricks.com/_extras/api-refs/jobs-2.1-aws.yaml
Latest Reply
@Debayan Mukherjee, are you suggesting reverting the openapi version specified in https://docs.databricks.com/_extras/api-refs/jobs-2.1-aws.yaml from 3.1.0 to 3.0.3?
9 More Replies
- 2010 Views
- 3 replies
- 0 kudos
I have a job with multiple tasks like Task1 -> Task2 -> Task3. I am trying to call the job using the "run now" API. Task details are below. Task1 executes a notebook with some input parameters. Task2 runs using "ABC.jar", so it's a JAR-based task ...
Latest Reply
Hi, it would be a good feature to be able to pass parameters at the task level. We have scenarios where we would like to create a job with multiple tasks (notebook/dbt) and pass parameters at the task level.
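For reference, a minimal run-now sketch for a job mixing notebook and JAR tasks. Note that in Jobs API 2.1 the notebook_params / jar_params fields apply to every task of the matching type in the run rather than to one named task, which is exactly the gap the reply describes. Host, token, job id, and parameter values are placeholders.

```python
# Minimal sketch of Jobs API 2.1 run-now with per-type (not per-task) parameters.
import requests

host = "https://<workspace-host>"
token = "<personal-access-token>"

payload = {
    "job_id": 12345,                                    # hypothetical job id
    "notebook_params": {"input_date": "2023-01-01"},    # seen by the notebook task(s)
    "jar_params": ["--mode", "incremental"],            # seen by the JAR task(s)
}

resp = requests.post(
    f"{host}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())  # {"run_id": ...}
```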
2 More Replies
- 6445 Views
- 5 replies
- 1 kudos
I want to know what happened to my cluster and whether I can recover it. I logged into my Databricks account and couldn't find my jobs or my cluster. I couldn't find any log of the deleted cluster because the logs live in the cluster interface. I entered t...
Latest Reply
Dear folks, when the table has been deleted, why am I unable to create a table with the same name? It continuously gives me the error "DeltaAnalysisException: Cannot create table ('`spark_catalog`.`default`.`Customer_Data`'). The associated location ('...
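If this is the usual leftover-files case, a hedged sketch of one way to recover (notebook code, where spark and dbutils are predefined). The location below is hypothetical; confirm the real path first (for example with DESCRIBE DETAIL before dropping) and only delete data you are sure is disposable.

```python
# Minimal sketch: the create fails because the old managed-table location is not empty,
# so clear that directory before recreating the table. Path and columns are illustrative.
spark.sql("DROP TABLE IF EXISTS spark_catalog.default.Customer_Data")

old_location = "dbfs:/user/hive/warehouse/customer_data"   # hypothetical location
dbutils.fs.rm(old_location, True)                          # recursively remove leftover files

spark.sql("""
    CREATE TABLE spark_catalog.default.Customer_Data (id INT, name STRING)
    USING DELTA
""")
```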
4 More Replies
- 4889 Views
- 2 replies
- 1 kudos
Latest Reply
The above code will create two jobs. JOB-1: dataframe: DataFrame = spark.createDataFrame(data=data, schema=schema). The createDataFrame function is responsible for inferring the schema from the provided data or using the specified schema. Depending on the...
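A minimal sketch of the pattern the reply is describing, with an explicit schema passed to createDataFrame; the data, column names, and types are illustrative.

```python
# Minimal sketch: passing an explicit schema lets createDataFrame skip type inference.
# Whether inference costs a separate Spark job depends on the source (e.g. an RDD vs. a
# small local list sampled on the driver); the action at the end launches the real work.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

spark = SparkSession.builder.getOrCreate()

data = [("alice", 1), ("bob", 2)]
schema = StructType([
    StructField("name", StringType(), True),
    StructField("id", IntegerType(), True),
])

df = spark.createDataFrame(data=data, schema=schema)

print(df.count())  # an action such as count() triggers the job(s) that process the data
```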
1 More Reply
- 3858 Views
- 3 replies
- 0 kudos
How do engineering teams out there version control their jobs? If there is a production issue, can I revert to an older version of the job?
Latest Reply
You can use version-controlled source code for your Databricks job; each time you need to roll back to an older version of your job, you just move back to the older version of the code. For version-controlled source code you have multiple choices: - Use a noteb...
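One way to make the job definition itself revertible, sketched under the assumption that the job's settings JSON is kept in source control and re-applied with the Jobs API reset endpoint; the host, token, job id, and file path are placeholders.

```python
# Minimal sketch: re-apply a versioned job definition with jobs/reset to roll back.
import json
import requests

host = "https://<workspace-host>"
token = "<personal-access-token>"

with open("jobs/nightly_etl.json") as f:   # hypothetical versioned job settings file
    settings = json.load(f)

resp = requests.post(
    f"{host}/api/2.1/jobs/reset",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": 12345, "new_settings": settings},   # hypothetical job id
)
resp.raise_for_status()
```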
2 More Replies
by Ludo • New Contributor III
- 7083 Views
- 7 replies
- 2 kudos
Hello, this is a question about our platform with `Databricks Runtime 11.3 LTS`. I'm running a job with multiple tasks in parallel using a shared cluster. Each task runs a dedicated Scala class within a JAR library attached as a dependency. One of the tasks fails (c...
Latest Reply
Hi, this actually should not be marked as solved. We are having the same problem: whenever a shared job cluster crashes for some reason (generally OOM), all tasks keep failing indefinitely, with the error message described above. This is ac...
6 More Replies
- 810 Views
- 1 reply
- 0 kudos
Hi! When I use `databricks jobs list --version=2.0` I get all jobs deployed using the 2.0 and 2.1 APIs; however, when I use `databricks jobs list --version=2.1` I only get jobs deployed using the 2.1 API. This is a behaviour that we've only experienced recent...
Latest Reply
Hi @Guillermo Sanchez, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.
by Eelke • New Contributor II
- 2744 Views
- 2 replies
- 2 kudos
This seems impossible with the cron syntax that Databricks is using, but maybe I am wrong? If this is indeed not possible, it seems to me a missing feature, and I would therefore like to suggest this feature to you.
Latest Reply
Hi @Eelke van Foeken, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. T...
1 More Reply
- 1420 Views
- 1 reply
- 3 kudos
I am trying to start the same job multiple times using the Python SDK's "run_now" command. If the number of requests exceeds the maximum concurrent runs, the status of the run will be Skipped and the run will not be executed. Is there any way to queue...
Latest Reply
Hi, we do have a private preview feature for queueing, which will be enabled shortly. Please tag me (@Debayan Mukherjee) in your next update so that I get notified.
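Queueing has since surfaced as a job-level setting in Jobs API 2.1; a hedged sketch assuming the queue flag on the job settings, with the host, token, and job id as placeholders.

```python
# Minimal sketch: enable queueing on the job, then repeated run-now requests that
# exceed max_concurrent_runs should be queued instead of being skipped.
import requests

host = "https://<workspace-host>"
token = "<personal-access-token>"
headers = {"Authorization": f"Bearer {token}"}
job_id = 12345   # hypothetical job id

# Enable queueing on the existing job definition (partial settings update).
requests.post(
    f"{host}/api/2.1/jobs/update",
    headers=headers,
    json={"job_id": job_id, "new_settings": {"queue": {"enabled": True}}},
).raise_for_status()

# Trigger several runs; those beyond the concurrency limit should now queue.
for _ in range(3):
    resp = requests.post(
        f"{host}/api/2.1/jobs/run-now", headers=headers, json={"job_id": job_id}
    )
    resp.raise_for_status()
    print(resp.json())
```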
- 7333 Views
- 3 replies
- 3 kudos
Hi all, I need to migrate just notebooks & jobs from one workspace to another. Is there a utility to do so?
Latest Reply
@NOOR BASHA SHAIK you can also try Databricks Connect, in addition to the information that @Artem Sheiko provided.
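One manual approach, sketched with the Workspace export/import API for notebooks and the Jobs API for job definitions; the hosts, tokens, notebook path, and job id are placeholders.

```python
# Minimal sketch: copy a notebook and a job definition from one workspace to another.
import requests

src = {"host": "https://<source-host>", "token": "<source-token>"}
dst = {"host": "https://<target-host>", "token": "<target-token>"}

def headers(ws):
    return {"Authorization": f"Bearer {ws['token']}"}

# 1) Copy a notebook (export returns base64 content, which import accepts as-is).
export = requests.get(
    f"{src['host']}/api/2.0/workspace/export",
    headers=headers(src),
    params={"path": "/Shared/my_notebook", "format": "SOURCE"},   # hypothetical path
)
export.raise_for_status()
requests.post(
    f"{dst['host']}/api/2.0/workspace/import",
    headers=headers(dst),
    json={
        "path": "/Shared/my_notebook",
        "format": "SOURCE",
        "language": "PYTHON",
        "content": export.json()["content"],
        "overwrite": True,
    },
).raise_for_status()

# 2) Copy a job definition (create on the target takes only the settings object).
job = requests.get(
    f"{src['host']}/api/2.1/jobs/get",
    headers=headers(src),
    params={"job_id": 12345},   # hypothetical job id
)
job.raise_for_status()
requests.post(
    f"{dst['host']}/api/2.1/jobs/create",
    headers=headers(dst),
    json=job.json()["settings"],
).raise_for_status()
```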
2 More Replies