Data Engineering

Forum Posts

jpwp
by New Contributor III
  • 17459 Views
  • 11 replies
  • 3 kudos

Resolved! How to specify entry_point for python_wheel_task?

Can someone provide an example of a python_wheel_task and what the entry_point field should be? The jobs UI help popup says this about "entry_point": "Function to call when starting the wheel, for example: main. If the entry point does not exist in...

Latest Reply
hectorfi
New Contributor III
  • 3 kudos

Just in case anyone comes here in the future, this is kind of how Databricks executes these entry points... How do I know? I have banged my head against this wall for a couple of hours already.
from importlib import metadata
package_name = "some.package...
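
For anyone reconstructing this later: below is a minimal sketch, assuming a hypothetical installed wheel named my_wheel_package, of the lookup the reply describes. The entry_point field names a console_scripts entry point declared in the wheel's metadata, which is resolved and called when the task starts.

from importlib.metadata import distribution

package_name = "my_wheel_package"   # hypothetical distribution name of the wheel
entry_point_name = "main"           # value of the entry_point field in the task

dist = distribution(package_name)
# Select the console_scripts entry point whose name matches the task setting.
matches = [ep for ep in dist.entry_points
           if ep.group == "console_scripts" and ep.name == entry_point_name]
func = matches[0].load()   # imports the module and resolves the function
func()                     # called when the wheel task starts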

10 More Replies
guille-ci
by New Contributor
  • 383 Views
  • 1 reply
  • 0 kudos

[bug] databricks jobs list not displaying 2.0-created jobs

Hi! When I use `databricks jobs list --version=2.0` I get all jobs deployed using the 2.0 and 2.1 APIs; however, when I use `databricks jobs list --version=2.1` I only get jobs deployed using the 2.1 API. This is a behaviour that we've only experienced recent...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Guillermo Sanchez, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.

Matt1209
by New Contributor II
  • 509 Views
  • 1 reply
  • 3 kudos

How can I queue requests when the number of runs exceeds the Maximum concurrent runs?

I am trying to start the same job multiple times using the Python SDK's "run_now" command. If the number of requests exceeds the Maximum concurrent runs, the status of the run will be Skipped and the run will not be executed. Is there any way to queue...

Latest Reply
Debayan
Esteemed Contributor III
  • 3 kudos

Hi, we have a private preview feature for queueing which will be enabled shortly. Please tag me (@Debayan Mukherjee) with your next update so that I get notified.
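
For later readers: job queueing did ship in the Jobs 2.1 API as a job-level setting. A hedged sketch of what enabling it looks like in a jobs/create payload (the rest of the job definition is elided and hypothetical):

job_settings = {
    "name": "example-job",           # hypothetical job definition
    "max_concurrent_runs": 1,
    # With queueing enabled, run_now requests beyond the concurrency
    # limit are queued instead of being marked Skipped.
    "queue": {"enabled": True},
}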

j_al
by New Contributor II
  • 1949 Views
  • 9 replies
  • 5 kudos

Jobs API 2.1 OpenAPI specification seems broken.

Jobs API 2.1 OpenAPI specification seems broken. The swagger file seems to be invalid. https://docs.databricks.com/_extras/api-refs/jobs-2.1-aws.yaml

Latest Reply
JeffShutt_
New Contributor II
  • 5 kudos

@Debayan Mukherjee, are you suggesting reverting the openapi version specified in https://docs.databricks.com/_extras/api-refs/jobs-2.1-aws.yaml from 3.1.0 to 3.0.3?

8 More Replies
Pritam
by New Contributor II
  • 1562 Views
  • 3 replies
  • 0 kudos

Not able to create job via Jobs API in Databricks

I am not able to create jobs via the Jobs API in Databricks. Error: INVALID_PARAMETER_VALUE: Job settings must be specified. I simply copied the JSON file and saved it. Loaded the same JSON file and tried to create the job via the API, but got the above erro...

Latest Reply
rAlex
New Contributor II
  • 0 kudos

@Pritam Arya, I had the same problem today. To use the JSON that you can get from the GUI for an existing job in a request to the Jobs API, you want to use just the JSON that is the value of the settings key.
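
A minimal sketch of that fix (the workspace URL, token, and file name are placeholders): post only the value of the settings key, not the whole object returned by jobs/get or the View JSON dialog.

import json
import requests

host = "https://abcd.azuredatabricks.net"       # placeholder workspace URL
headers = {"Authorization": "Bearer <token>"}   # placeholder token

with open("job.json") as f:                     # JSON copied from the GUI
    exported = json.load(f)

payload = exported["settings"]                  # drops job_id, created_time, etc.
resp = requests.post(f"{host}/api/2.1/jobs/create", headers=headers, json=payload)
resp.raise_for_status()
print(resp.json()["job_id"])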

2 More Replies
keenan_jones7
by New Contributor II
  • 8631 Views
  • 3 replies
  • 5 kudos

Cannot create job through Jobs API

import requests
import json

instance_id = 'abcd.azuredatabricks.net'
api_version = '/api/2.0'
api_command = '/jobs/create'
url = f"https://{instance_id}{api_version}{api_command}"
headers = {'Authorization': 'Bearer myToken'}
params = { "settings...

Latest Reply
rAlex
New Contributor II
  • 5 kudos

@keenan_jones7, I had the same problem today. It looks like you've copied and pasted the JSON that Databricks displays in the GUI when you select View JSON from the dropdown menu when viewing a job. In order to use that JSON in a request to the Jobs ...
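
A hedged reconstruction of a working call (the token, cluster id, and job definition are placeholders): the settings fields go at the top level of the request body, with no "settings" wrapper, and the body is sent as JSON rather than query parameters.

import requests

instance_id = "abcd.azuredatabricks.net"
url = f"https://{instance_id}/api/2.0/jobs/create"
headers = {"Authorization": "Bearer myToken"}

job_spec = {                                    # formerly params["settings"]
    "name": "example-job",                      # hypothetical job definition
    "existing_cluster_id": "<cluster-id>",      # placeholder cluster id
    "notebook_task": {"notebook_path": "/Example"},
    "max_concurrent_runs": 1,
}
resp = requests.post(url, headers=headers, json=job_spec)
resp.raise_for_status()
print(resp.json())   # {"job_id": ...}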

2 More Replies
rsamant07
by New Contributor III
  • 2744 Views
  • 11 replies
  • 2 kudos

Resolved! DBT Job Type Authenticating to Azure DevOps for git_source

We are trying to execute Databricks jobs for the dbt task type, but it is failing to authenticate to git. The problem is that the job is created using a service principal, but the service principal doesn't seem to have access to the repo. A few questions we have: 1) can we giv...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Rahul Samant, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I suggest p...
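
For reference, a hedged sketch of the git_source block the Jobs API accepts for an Azure DevOps repo (the URL and branch are placeholders); whatever identity runs the job still needs a Git credential with read access to this repo for the checkout to succeed.

git_source = {
    "git_url": "https://dev.azure.com/my-org/my-project/_git/my-repo",  # placeholder
    "git_provider": "azureDevOpsServices",
    "git_branch": "main",
}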

10 More Replies
grazie
by Contributor
  • 1083 Views
  • 3 replies
  • 2 kudos

Do you need to be workspace admin to create jobs?

We're using a setup where we use GitLab CI to deploy workflows using a service principal, using the Jobs API (2.1): https://docs.databricks.com/dev-tools/api/latest/jobs.html#operation/JobsCreate When we wanted to reduce permissions of the ci to minimu...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Geir Iversen, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...

2 More Replies
shan_chandra
by Honored Contributor III
  • 959 Views
  • 1 reply
  • 1 kudos

Resolved! Adding spark_conf tag on Jobs API

Using the Jobs API, when we create a new job to run on an interactive cluster, can we add the spark_conf tag and specify Spark config tuning parameters?

Latest Reply
shan_chandra
Honored Contributor III
  • 1 kudos

spark_conf needs to be set prior to the start of the cluster, or the existing cluster has to be restarted. Hence, the spark_conf tag is available only on the job_cluster. You may have to set the configs manually on the interactive cluster prior to using ...
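
A minimal sketch of the point above: spark_conf is set on the job cluster inside the jobs/create payload, so it is applied when the cluster starts. The node type, runtime version, and conf values here are illustrative only.

job_cluster = {
    "job_cluster_key": "tuned_cluster",
    "new_cluster": {
        "spark_version": "10.4.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2,
        "spark_conf": {                          # applied at cluster start
            "spark.sql.shuffle.partitions": "64",
            "spark.driver.maxResultSize": "8g",
        },
    },
}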

spott_submittab
by New Contributor II
  • 560 Views
  • 1 reply
  • 0 kudos

A Job "pool"? (or task pool)

I'm trying to run a single job multiple times with different parameters, where the number of concurrent jobs is less than the number of parameter sets. I have a job (or task...) J that takes parameter set p; I have 100 p values I want to run, however I onl...

  • 560 Views
  • 1 replies
  • 0 kudos
Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

This is something new, an interesting question. Try reaching out to the Databricks support team; maybe they have some good ideas here.
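
In the meantime, a hedged sketch of a client-side "job pool" (the job id and parameter sets are hypothetical, and this assumes the databricks-sdk Python package): submit a new run only when fewer than POOL_SIZE runs are active, so no request is Skipped for exceeding Maximum concurrent runs.

import time
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()   # reads credentials from the environment/config
JOB_ID = 123            # hypothetical job id
POOL_SIZE = 5           # keep at or below the job's Maximum concurrent runs

def active_run_count() -> int:
    return sum(1 for _ in w.jobs.list_runs(job_id=JOB_ID, active_only=True))

pending = [{"p": str(i)} for i in range(100)]   # the 100 parameter sets
while pending:
    if active_run_count() < POOL_SIZE:
        w.jobs.run_now(job_id=JOB_ID, notebook_params=pending.pop())
    else:
        time.sleep(30)   # wait for a slot to free up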

apayne
by New Contributor III
  • 1445 Views
  • 1 reply
  • 4 kudos

Databricks Jobs API not returning notebook run results?

Calling a Databricks notebook using the REST API, I can confirm that it is executing the notebook, but it is not accepting my parameters or returning a notebook output. Any ideas on what I am doing wrong here? My code and notebook function are below, tryin...

Latest Reply
apayne
New Contributor III
  • 4 kudos

Resolved this by using dbutils within the notebook being called from the API.
# databricks notebook function
data = dbutils.widgets.get('data')  # pulls base_parameters from API call

def add_test(i):
    result = i + ' COMPLETE'
    return result
...
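
A hedged completion of that pattern (this runs inside a Databricks notebook, where dbutils is predefined): dbutils.notebook.exit is what surfaces a return value as notebook_output when the run result is read back through the Jobs API runs/get-output endpoint.

data = dbutils.widgets.get('data')   # receives base_parameters from the API call

def add_test(i):
    return i + ' COMPLETE'

dbutils.notebook.exit(add_test(data))   # returned to the API caller as notebook_output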

johnny1
by New Contributor II
  • 1055 Views
  • 2 replies
  • 0 kudos

Why does it still complain that the REST API version is 2.0 even though it is set to 2.1?

root@387ece6d15b2:/usr/workspace# databricks --version
Version 0.17.3
root@387ece6d15b2:/usr/workspace# databricks jobs configure --version=2.1
root@387ece6d15b2:/usr/workspace# databricks jobs get --job-id 123
WARN: Your CLI is configured to use Jobs AP...

Latest Reply
johnny1
New Contributor II
  • 0 kudos

The command "databricks jobs configure --version=2.1" does not work. The workaround is adding the option "--version=2.1" to each databricks jobs/runs command. It is not very convenient.
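
A hedged note: with the legacy CLI (0.17.x), the Jobs API version can also be pinned per profile in ~/.databrickscfg, which is the setting the configure subcommand is supposed to write. The host value below is a placeholder; check whether the key is present under the profile you actually use.

[DEFAULT]
host = https://abcd.azuredatabricks.net
jobs-api-version = 2.1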

1 More Replies
swetha
by New Contributor III
  • 1394 Views
  • 3 replies
  • 4 kudos

Resolved! Retrieving the job-ids of notebooks running inside tasks

I have created a job. Inside the job I have created tasks which are independent. I have used concurrent futures to achieve parallelism, and in each task there are a couple of notebooks running (which are independent). Each notebook running ha...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @swetha kadiyala, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...
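
One widely shared (but undocumented) way to read the current run's identifiers from inside a notebook task is via the notebook context tags; a hedged sketch follows, with the caveat that tag names can vary across runtime versions.

import json

ctx = json.loads(
    dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
)
tags = ctx.get("tags", {})
print(tags.get("jobId"), tags.get("runId"))   # None when run interactively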

2 More Replies
umarkhan
by New Contributor II
  • 1912 Views
  • 4 replies
  • 1 kudos

Driver context not found for python spark for spark_submit_task using Jobs API submit run endpoint

I am trying to run a multi-file Python job in Databricks without using notebooks. I have tried setting this up by: creating a docker image using DBR 10.4 LTS as a base and adding the zipped Python application to that; making a call to the run submit...

Latest Reply
Vidula
Honored Contributor
  • 1 kudos

Hi @Umar Khan, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks!
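
For anyone with the same setup, a hedged sketch (paths, cluster spec, and app layout are hypothetical) of a runs/submit payload that ships a multi-file Python app as a zip via --py-files, with a thin main.py as the spark-submit entry script:

payload = {
    "run_name": "multi-file-python-app",
    "tasks": [{
        "task_key": "main",
        "new_cluster": {                      # spark_submit_task requires a new cluster
            "spark_version": "10.4.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 1,
        },
        "spark_submit_task": {
            "parameters": [
                "--py-files", "dbfs:/apps/my_app.zip",   # zipped package (hypothetical path)
                "dbfs:/apps/main.py",                    # entry script (hypothetical path)
            ],
        },
    }],
}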

3 More Replies