Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Pritam
by New Contributor II
  • 2789 Views
  • 4 replies
  • 1 kudos

Not able to create job via Jobs API in Databricks

I am not able to create jobs via the Jobs API in Databricks. Error=INVALID_PARAMETER_VALUE: Job settings must be specified. I simply copied the JSON file and saved it. Loaded the same JSON file and tried to create the job via the API, but got the above erro...

Latest Reply
rAlex
New Contributor III
  • 1 kudos

@Pritam Arya I had the same problem today. In order to use the JSON that you get from the GUI for an existing job in a request to the Jobs API, you want to use just the JSON that is the value of the settings key.

3 More Replies
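A minimal sketch of that fix (the host, token, and job ID below are placeholders, not values from this thread): fetch the existing job with jobs/get, then POST only the value of its settings key to jobs/create.

import requests

host = "https://abcd.azuredatabricks.net"  # placeholder workspace URL
headers = {"Authorization": "Bearer <personal-access-token>"}

# jobs/get returns the reusable job definition nested under "settings".
job = requests.get(f"{host}/api/2.1/jobs/get", headers=headers,
                   params={"job_id": 123}).json()

# POST only the settings object -- not the full jobs/get response and
# not a payload wrapped in an outer "settings" key.
resp = requests.post(f"{host}/api/2.1/jobs/create", headers=headers,
                     json=job["settings"])
print(resp.json())  # on success: {"job_id": <new job id>}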
j_al
by New Contributor II
  • 4060 Views
  • 10 replies
  • 5 kudos

Jobs API 2.1 OpenAPI specification seems broken.

Jobs API 2.1 OpenAPI specification seems broken. The swagger file seems to be invalid. https://docs.databricks.com/_extras/api-refs/jobs-2.1-aws.yaml

Latest Reply
JeffShutt_
New Contributor II
  • 5 kudos

@Debayan Mukherjee, are you suggesting to revert the openapi version specified in https://docs.databricks.com/_extras/api-refs/jobs-2.1-aws.yaml from 3.1.0 to 3.0.3?

9 More Replies
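One quick way to see what the tooling is objecting to is to check which OpenAPI version the published file declares; validators built for OpenAPI 3.0.x commonly report a 3.1.0 document as invalid even when it is well-formed. A small sketch (assumes the pyyaml package is installed):

import requests
import yaml  # pip install pyyaml

url = "https://docs.databricks.com/_extras/api-refs/jobs-2.1-aws.yaml"
spec = yaml.safe_load(requests.get(url).text)

# The version declared at the top of the spec determines which validators accept it.
print(spec.get("openapi"))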
RKNutalapati
by Valued Contributor
  • 1503 Views
  • 3 replies
  • 0 kudos

Jobs API "run now" - How to set task-wise parameters

I have a job with multiple tasks like Task1 -> Task2 -> Task3. I am trying to call the job using the "run now" API. Task details are below. Task1 - It executes a notebook with some input parameters. Task2 - It runs using "ABC.jar", so it's a jar-based task ...

Latest Reply
Harsha777
New Contributor III
  • 0 kudos

Hi, it would be a good feature to pass parameters at task level. We have scenarios where we would like to create a job with multiple tasks (notebook/dbt) and pass parameters at task level.

2 More Replies
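For reference, run-now parameters in Jobs API 2.1 are keyed by task type rather than by task name, so they apply to every matching task in the job. A hedged sketch of the shape (host, token, job ID, and parameter values are placeholders):

import requests

host = "https://abcd.azuredatabricks.net"  # placeholder workspace URL
headers = {"Authorization": "Bearer <personal-access-token>"}

# notebook_params reach every notebook task (Task1 here); jar_params
# reach every JAR task (Task2); there is no per-task addressing.
payload = {
    "job_id": 123,
    "notebook_params": {"input_date": "2023-01-01"},
    "jar_params": ["arg1", "arg2"],
}
resp = requests.post(f"{host}/api/2.1/jobs/run-now", headers=headers, json=payload)
print(resp.json())  # on success includes the new run_id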
jpwp
by New Contributor III
  • 20927 Views
  • 12 replies
  • 8 kudos

Resolved! How to specify entry_point for python_wheel_task?

Can someone provide me an example for a python_wheel_task and what the entry_point field should be? The jobs UI help popup says this about "entry_point": "Function to call when starting the wheel, for example: main. If the entry point does not exist in...

Latest Reply
hectorfi
New Contributor III
  • 8 kudos

Just in case anyone comes here in the future, this is kind of how Databricks executes these entry points... How do I know? I have banged my head against this wall for a couple of hours already. from importlib import metadata package_name = "some.package...

11 More Replies
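Expanding hectorfi's truncated snippet into a runnable sketch: the entry_point value names an entry point declared by the wheel, which can be resolved with importlib.metadata. The package and entry point names below are hypothetical.

from importlib.metadata import distribution

# Hypothetical wheel whose setup declares:
#   entry_points={"console_scripts": ["main = my_pkg.cli:run"]}
package_name = "my_pkg"    # distribution name of the installed wheel
entry_point_name = "main"  # the job's entry_point field

# Find the named entry point in the distribution's metadata and invoke it.
eps = distribution(package_name).entry_points
ep = next(ep for ep in eps if ep.name == entry_point_name)
ep.load()()  # resolves my_pkg.cli:run and calls it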
guille-ci
by New Contributor
  • 640 Views
  • 1 replies
  • 0 kudos

[bug] databricks jobs list not displaying 2.0 created jobs

Hi! When I use `databricks jobs list --version=2.0` I get all jobs deployed using 2.0 and 2.1 API, however, when I use `databricks jobs list --version=2.1` I only get jobs deployed using 2.1 API. This is a behaviour that we've only experienced recent...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Guillermo Sanchez, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question. Thanks.

Matt1209
by New Contributor II
  • 905 Views
  • 1 replies
  • 3 kudos

How to queue requests for later execution when they exceed the maximum concurrent runs?

I am trying to start the same job multiple times using the Python SDK's "run_now" command. If the number of requests exceeds the maximum concurrent runs, the status of the run will be Skipped and the run will not be executed. Is there any way to queue...

Latest Reply
Debayan
Esteemed Contributor III
  • 3 kudos

Hi, we do have a private preview feature for queueing which will be enabled shortly. Please tag me (@Debayan Mukherjee) with your next update so that I get notified.

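Until that queueing feature is available, a common client-side workaround is to poll the job's active runs and only trigger a new one when a slot is free. A sketch against the REST API (host, token, job ID, and the 30-second poll interval are assumptions; MAX_CONCURRENT must match the job's max_concurrent_runs setting):

import time
import requests

host = "https://abcd.azuredatabricks.net"  # placeholder workspace URL
headers = {"Authorization": "Bearer <personal-access-token>"}
JOB_ID = 123
MAX_CONCURRENT = 5  # keep equal to the job's max_concurrent_runs

def run_when_slot_free(notebook_params):
    # Block until an active-run slot is free, then trigger the job.
    while True:
        active = requests.get(f"{host}/api/2.1/jobs/runs/list", headers=headers,
                              params={"job_id": JOB_ID, "active_only": "true"}
                              ).json().get("runs", [])
        if len(active) < MAX_CONCURRENT:
            resp = requests.post(f"{host}/api/2.1/jobs/run-now", headers=headers,
                                 json={"job_id": JOB_ID,
                                       "notebook_params": notebook_params})
            return resp.json()["run_id"]
        time.sleep(30)  # wait for a running invocation to finish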
keenan_jones7
by New Contributor II
  • 10209 Views
  • 3 replies
  • 5 kudos

Cannot create job through Jobs API

import requests
import json

instance_id = 'abcd.azuredatabricks.net'
api_version = '/api/2.0'
api_command = '/jobs/create'
url = f"https://{instance_id}{api_version}{api_command}"
headers = {'Authorization': 'Bearer myToken'}
params = { "settings...

Latest Reply
rAlex
New Contributor III
  • 5 kudos

@keenan_jones7 I had the same problem today. It looks like you've copied and pasted the JSON that Databricks displays in the GUI when you select View JSON from the dropdown menu when viewing a job. In order to use that JSON in a request to the Jobs ...

2 More Replies
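A corrected sketch of the request above (the job name, notebook path, and cluster ID are hypothetical stand-ins): the create payload is the settings object itself, with its fields at the top level, not wrapped in an outer "settings" key and without the read-only fields such as job_id that jobs/get returns.

import requests

url = "https://abcd.azuredatabricks.net/api/2.1/jobs/create"
headers = {"Authorization": "Bearer myToken"}

payload = {
    "name": "my_job",
    "tasks": [{
        "task_key": "main",
        "notebook_task": {"notebook_path": "/Users/me/my_notebook"},
        "existing_cluster_id": "1234-567890-abcde123",
    }],
}
print(requests.post(url, headers=headers, json=payload).json())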
rsamant07
by New Contributor III
  • 4634 Views
  • 11 replies
  • 2 kudos

Resolved! DBT Job Type Authenticating to Azure Devops for git_source

We are trying to execute Databricks jobs with the dbt task type, but the job is failing to authenticate to git. The problem is that the job is created using a service principal, but the service principal doesn't seem to have access to the repo. A few questions we have: 1) can we giv...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Rahul Samant, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I suggest p...

10 More Replies
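For context, a hedged sketch of the job-spec fragment under discussion: a dbt task whose project comes from Azure DevOps via git_source (the URL, branch, and commands are placeholders, and the cluster/library configuration a dbt task also needs is omitted). Whether a run can fetch the repo depends on the git credential configured for the principal that owns the job, which is exactly the open question in this thread.

job_spec = {
    "name": "dbt_job",
    "git_source": {
        "git_url": "https://dev.azure.com/myorg/myproject/_git/myrepo",
        "git_provider": "azureDevOpsServices",
        "git_branch": "main",
    },
    "tasks": [{
        "task_key": "dbt_run",
        "dbt_task": {"commands": ["dbt deps", "dbt run"]},
    }],
}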
grazie
by Contributor
  • 1831 Views
  • 3 replies
  • 2 kudos

Do you need to be workspace admin to create jobs?

We're using a setup where we use GitLab CI to deploy workflows using a service principal, using the Jobs API (2.1): https://docs.databricks.com/dev-tools/api/latest/jobs.html#operation/JobsCreate. When we wanted to reduce permissions of the CI to minimu...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Geir Iversen, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...

2 More Replies
shan_chandra
by Esteemed Contributor
  • 2149 Views
  • 1 replies
  • 1 kudos

Resolved! Adding spark_conf tag on Jobs API

Using the Jobs API, when we create a new job to run on an interactive cluster, can we add the spark_conf tag and specify Spark config tuning parameters?

Latest Reply
shan_chandra
Esteemed Contributor
  • 1 kudos

spark_conf needs to be set prior to the start of the cluster, or you have to restart the existing cluster. Hence, the spark_conf tag is available only on the job_cluster. You may have to set the configs manually on the interactive cluster prior to using ...

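To make that concrete, here is a minimal sketch of a jobs/create payload with spark_conf set on the job cluster (the notebook path, node type, and config value are placeholders):

job_spec = {
    "name": "tuned_job",
    "tasks": [{
        "task_key": "main",
        "notebook_task": {"notebook_path": "/Users/me/my_notebook"},
        "new_cluster": {
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
            # Applied when the job cluster starts; an interactive cluster
            # would need these set before it is started instead.
            "spark_conf": {"spark.sql.shuffle.partitions": "200"},
        },
    }],
}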
spott_submittab
by New Contributor II
  • 892 Views
  • 1 replies
  • 0 kudos

A Job "pool"? (or task pool)

I'm trying to run a single job multiple times with different parameters, where the number of concurrent jobs is less than the number of parameters. I have a job (or task...) J that takes parameter set p. I have 100 p values I want to run, however I onl...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

This is something new, an interesting question. Try reaching out to the Databricks support team; maybe they have some good ideas here.

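One client-side way to get this pooling behavior is to cap in-flight runs with a thread pool: each worker triggers a run and blocks until it finishes, so at most max_workers runs are active at once. A sketch (host, token, job ID, parameter shape, and poll interval are assumptions; keep max_workers at or below the job's max_concurrent_runs so no run is skipped):

import time
import requests
from concurrent.futures import ThreadPoolExecutor

host = "https://abcd.azuredatabricks.net"  # placeholder workspace URL
headers = {"Authorization": "Bearer <personal-access-token>"}
JOB_ID = 123
param_sets = [{"p": str(i)} for i in range(100)]  # the 100 parameter sets

def run_and_wait(params):
    # Trigger one run, then poll until it reaches a terminal state.
    run_id = requests.post(f"{host}/api/2.1/jobs/run-now", headers=headers,
                           json={"job_id": JOB_ID, "notebook_params": params}
                           ).json()["run_id"]
    while True:
        state = requests.get(f"{host}/api/2.1/jobs/runs/get", headers=headers,
                             params={"run_id": run_id}).json()["state"]
        if state.get("life_cycle_state") in ("TERMINATED", "SKIPPED",
                                             "INTERNAL_ERROR"):
            return run_id, state.get("result_state")
        time.sleep(30)

with ThreadPoolExecutor(max_workers=5) as pool:  # at most 5 runs in flight
    results = list(pool.map(run_and_wait, param_sets))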
apayne
by New Contributor III
  • 2481 Views
  • 1 replies
  • 4 kudos

Databricks Jobs API not returning notebook run results?

Calling a Databricks notebook using the REST API, I can confirm that it is executing the notebook, but it is not accepting my parameters or returning a notebook output. Any ideas on what I am doing wrong here? My code and notebook function are below, tryin...

Latest Reply
apayne
New Contributor III
  • 4 kudos

Resolved this by using dbutils within the notebook being called from the API.

# databricks notebook function
data = dbutils.widgets.get('data')  # pulls base_parameters from API call

def add_test(i):
    result = i + ' COMPLETE'
    return result
...

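The other half of this pattern, for anyone who also needs the run's return value: end the notebook with dbutils.notebook.exit and read the result back through runs/get-output. A sketch (host, token, and run ID are placeholders):

# Notebook side: return a value to the API caller.
dbutils.notebook.exit(add_test(data))  # becomes notebook_output.result

# Caller side, after the run has finished:
import requests
out = requests.get(
    "https://abcd.azuredatabricks.net/api/2.1/jobs/runs/get-output",
    headers={"Authorization": "Bearer <personal-access-token>"},
    params={"run_id": 456},
).json()
print(out["notebook_output"]["result"])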
johnny1
by New Contributor II
  • 1577 Views
  • 2 replies
  • 0 kudos

Why does it still complain the REST API version is 2.0 even though it is set to 2.1?

root@387ece6d15b2:/usr/workspace# databricks --version
Version 0.17.3
root@387ece6d15b2:/usr/workspace# databricks jobs configure --version=2.1
root@387ece6d15b2:/usr/workspace# databricks jobs get --job-id 123
WARN: Your CLI is configured to use Jobs AP...

Latest Reply
johnny1
New Contributor II
  • 0 kudos

Command "databricks jobs configure --version=2.1" not work.workaround with adding option "--version=2.1" to each databricks jobs/runs command .It is not very convenient.

1 More Replies
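For what it's worth, the legacy CLI's jobs configure command is supposed to persist this choice by writing a jobs-api-version entry into ~/.databrickscfg, so one thing worth checking is whether that line actually landed in the profile you are using. The expected shape (host and token are placeholders):

[DEFAULT]
host = https://abcd.azuredatabricks.net
token = <personal-access-token>
jobs-api-version = 2.1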
swetha
by New Contributor III
  • 2217 Views
  • 3 replies
  • 4 kudos

Resolved! Retrieving the job-id's of a notebook running inside tasks

I have created a job. Inside the job I have created tasks which are independent. I have used the concept of concurrent futures to exhibit parallelism, and in each task there are a couple of notebooks running (which are independent). Each notebook running ha...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @swetha kadiyala, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...

2 More Replies
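Since the thread was resolved without the final answer quoted here, a common way to get the job and run IDs from inside a running notebook is to read the run context that dbutils exposes. Note that entry_point is an internal, unofficial API, so the tag names below are assumptions that may change between platform versions.

import json

ctx = json.loads(
    dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
)
tags = ctx.get("tags", {})
print(tags.get("jobId"), tags.get("runId"))  # identifies this notebook's run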