- 1756 Views
- 2 replies
- 1 kudos
I have an Azure Databricks job that is triggered from ADF via an API call. I want to see why the job has been taking n minutes to complete its tasks. Looking at the job execution results, the job execution time says 15 mins and the individual cells/commands d...
Latest Reply
Hey there @DineshKumar Does @Prabakar Ammeappin's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? Else please let us know if you need more help. Cheers!
- 3565 Views
- 7 replies
- 4 kudos
I need to find out all jobs which are currently running, and not get the other jobs. The below command fetches all the jobs: curl --location --request GET 'https://xxxxxx.gcp.databricks.com/api/2.1/jobs/list?active_only=true&expand_tasks=true&run_type=JOB_RUN...
Latest Reply
Hi @Sumit Rohatgi It seems like active_only=true only applies to the jobs/runs/list API and not to jobs/list. Can you please try the jobs/runs/list API?
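Following the reply above, here is a minimal sketch of calling jobs/runs/list with active_only=true to get only pending/running runs. The workspace host, token, and helper names are placeholders of my own, not part of the original thread:

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

def build_runs_list_url(host, active_only=True, run_type="JOB_RUN"):
    """Build the Jobs 2.1 runs/list URL; active_only=true restricts
    the response to runs that are pending or currently running."""
    query = urlencode({
        "active_only": str(active_only).lower(),
        "run_type": run_type,
    })
    return f"{host}/api/2.1/jobs/runs/list?{query}"

def list_active_runs(host, token):
    """Fetch only the active runs from the workspace."""
    req = Request(build_runs_list_url(host),
                  headers={"Authorization": f"Bearer {token}"})
    with urlopen(req) as resp:
        return json.load(resp).get("runs", [])
```

The key difference from the question's curl command is the endpoint: jobs/runs/list honours active_only, while jobs/list does not.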
- 1030 Views
- 2 replies
- 0 kudos
I have a job with multiple tasks like Task1 -> Task2 -> Task3. I am trying to call the job using the "run now" API. Task details are below. Task1 - executes a notebook with some input parameters. Task2 - runs using "ABC.jar", so it's a jar-based task ...
Latest Reply
@Rama Krishna N you can refer to https://docs.databricks.com/dev-tools/api/latest/jobs.html#operation/JobsRunNow

"jar_params": ["john", "doe", "35"],
"notebook_params": {
    "name": "john doe",
    "age": "35"
},
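To tie the fragment above into a complete request body, here is a hedged sketch of a run-now payload for a job mixing a notebook task and a jar task. Per the Jobs 2.1 docs, jar_params are routed to jar tasks and notebook_params to notebook tasks within the same job; the job_id and values below are made up:

```python
import json

def build_run_now_payload(job_id, jar_params=None, notebook_params=None):
    """Body for POST /api/2.1/jobs/run-now. jar_params go to JAR
    tasks; notebook_params go to notebook tasks in the same job."""
    payload = {"job_id": job_id}
    if jar_params:
        payload["jar_params"] = jar_params
    if notebook_params:
        payload["notebook_params"] = notebook_params
    return payload

body = build_run_now_payload(
    job_id=123,  # hypothetical job id
    jar_params=["john", "doe", "35"],
    notebook_params={"name": "john doe", "age": "35"},
)
print(json.dumps(body, indent=2))
```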
- 3062 Views
- 1 replies
- 2 kudos
Databricks jobs create API throws an unexpected error. Error response: {"error_code": "INVALID_PARAMETER_VALUE", "message": "Cluster validation error: Missing required field: settings.cluster_spec.new_cluster.size"} Any idea on this?
Latest Reply
Could you please specify num_workers in the JSON body and try the API again? Another recommendation: configure what you want in the UI, then press the "JSON" button, which shows the corresponding JSON that you can use with the API.
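As a sketch of the fix suggested above, the new_cluster block in a jobs/create body needs a size, i.e. num_workers (or an autoscale block). The cluster values here are illustrative only:

```python
import json

def build_new_cluster(spark_version, node_type_id, num_workers):
    """new_cluster block for POST /api/2.1/jobs/create. Omitting both
    num_workers and autoscale produces the 'Missing required field:
    settings.cluster_spec.new_cluster.size' validation error."""
    return {
        "spark_version": spark_version,
        "node_type_id": node_type_id,
        "num_workers": num_workers,
    }

cluster = build_new_cluster("10.4.x-scala2.12", "i3.xlarge", 2)
print(json.dumps(cluster, indent=2))
```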
by Sunny • New Contributor III
- 631 Views
- 1 replies
- 0 kudos
I have a workflow with a task that is dependent on an external application's execution (not residing in Databricks). After the external application finishes, how can I update the status of the task to complete? Currently, the Jobs API doesn't support status updat...
Latest Reply
Sunny • New Contributor III
Any inputs on this one, please?
- 2304 Views
- 9 replies
- 12 kudos
I'm trying to query delta tables using the JDBC connector in a Ruby app. I've noticed that it takes around 8 seconds just to connect to the Databricks cluster, and then additional time to run the query. The app is connected to a web portal where users genera...
Latest Reply
Hi @Aman Sehgal Could you please check SQL endpoints? A SQL endpoint uses the Photon engine, which can reduce query processing time, and a Serverless SQL endpoint can accelerate the launch time. More info: https://docs.databricks.com/sql/admin/sql-endpoin...
- 1673 Views
- 2 replies
- 0 kudos
Hi! This is my CI configuration. I added the databricks jobs configure --version=2.1 command but it still shows this error; any idea what I could be doing wrong? Error: Resetting Databricks Job with job_id 1036... WARN: Your CLI is configured to use...
Latest Reply
Hi @Alejandro Martinez , To set up and use the Databricks jobs CLI (and the job runs CLI) to call the Jobs REST API 2.1, update the CLI to version 0.16.0 or above. Run pip install databricks-cli --upgrade using the appropriate version of pip for your...
by Junee • New Contributor III
- 2957 Views
- 7 replies
- 3 kudos
I am using the Databricks Jobs API 2.1 to trigger and run my jobs. The "jobs/runs/submit" API helps in starting the cluster as well as creating the job and running it. This API works great for normal jobs as it also cleans up the cluster once the job finishes suc...
Latest Reply
@Junee, Anytime! It is crisply mentioned in the doc too. https://docs.databricks.com/clusters/index.html
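A minimal runs/submit body along the lines the question describes, sketched with placeholder task and cluster values: runs/submit creates a one-time run on a new job cluster, and that cluster is terminated automatically once the run finishes.

```python
import json

def build_submit_payload(run_name, notebook_path, spark_version,
                         node_type_id, num_workers):
    """Body for POST /api/2.1/jobs/runs/submit: a one-time run whose
    job cluster is cleaned up automatically after the run completes."""
    return {
        "run_name": run_name,
        "tasks": [{
            "task_key": "main",
            "notebook_task": {"notebook_path": notebook_path},
            "new_cluster": {
                "spark_version": spark_version,
                "node_type_id": node_type_id,
                "num_workers": num_workers,
            },
        }],
    }

payload = build_submit_payload("one-time-run", "/Shared/etl",
                               "10.4.x-scala2.12", "i3.xlarge", 1)
print(json.dumps(payload, indent=2))
```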
- 1351 Views
- 1 replies
- 0 kudos
Is there a setting that will auto-cleanup/delete jobs that are of a certain age (say 90 days old for example)?
Latest Reply
It is not available natively in Databricks. But you can write an administration script that analyzes your jobs data and automatically cleans up the older jobs as needed. It would be easiest to do this with the jobs API. List your jobs to get all the ...
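A sketch of the administration script described above, assuming the standard Jobs API flow (GET /api/2.1/jobs/list to enumerate jobs, POST /api/2.1/jobs/delete to remove them). The 90-day threshold and helper names are illustrative:

```python
import time

NINETY_DAYS_MS = 90 * 24 * 60 * 60 * 1000

def is_stale(created_time_ms, now_ms=None, max_age_ms=NINETY_DAYS_MS):
    """A job is stale when its created_time (epoch milliseconds, as
    returned by GET /api/2.1/jobs/list) is older than the cutoff."""
    if now_ms is None:
        now_ms = int(time.time() * 1000)
    return now_ms - created_time_ms > max_age_ms

def stale_job_ids(jobs, now_ms=None):
    """Collect the job_ids to pass to POST /api/2.1/jobs/delete."""
    return [j["job_id"] for j in jobs
            if is_stale(j["created_time"], now_ms)]
```

Running this on a schedule (e.g. as a Databricks job itself) would approximate the auto-cleanup the question asks about.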
by aladda • Honored Contributor II
- 629 Views
- 1 replies
- 0 kudos
I see the revision_timestamp parameter on NotebookTask https://docs.databricks.com/dev-tools/api/latest/jobs.html#jobsnotebooktask. An example of how to invoke it would be helpful.
Latest Reply
You can use the Databricks built-in version control feature, coupled with the NotebookTask Jobs API, to specify a specific version of the notebook based on the timestamp of the save, defined in Unix timestamp format: curl -n -X POST -H 'Content-Type: app...
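Since the curl in the reply is cut off, here is a sketch of the notebook_task block with revision_timestamp pinning a saved revision; the notebook path and timestamp value are made up:

```python
import json

def notebook_task_at_revision(notebook_path, revision_timestamp):
    """notebook_task block for the Jobs API; revision_timestamp is the
    Unix timestamp of the saved notebook revision to run, so the job
    executes that pinned version rather than the latest one."""
    return {
        "notebook_path": notebook_path,
        "revision_timestamp": revision_timestamp,
    }

task = notebook_task_at_revision("/Shared/report", 1642521600)
print(json.dumps(task, indent=2))
```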