a month ago
I'm exploring whether serverless (https://docs.databricks.com/en/jobs/run-serverless-jobs.html#create-a-job-using-serverless-compute) could be useful for our use case. I'd like to see an example of using serverless via the API. The docs say "To learn about using the Jobs API to create and run jobs that use serverless compute, see Jobs in the REST API reference". But that's just reference info, not a tutorial. Is there any tutorial-style documentation showing how to start a job using serverless?
a month ago
What specifically do you need — is it how to define the job so it runs on serverless? If that's the case: when creating a job, if you want to use serverless you simply don't specify a cluster configuration; this by default will make the job serverless.
a month ago
To create a serverless job using the API, you no longer need to specify one of new_cluster, existing_cluster_id, or job_cluster_key in each task. Instead, each task only requires a task_key and the task definition itself. Here is an example of how you can create a serverless job:
{
  "name": "Serverless Job",
  "tasks": [
    {
      "task_key": "My_task",
      "python_wheel_task": {
        "package_name": "databricks_jaws",
        "entry_point": "run_analysis",
        "named_parameters": {
          "dry_run": "true"
        }
      },
      "environment_key": "my-serverless-compute"
    }
  ],
  "tags": {
    "department": "sales"
  },
  "environments": [
    {
      "environment_key": "my-serverless-compute",
      "spec": {
        "client": "1",
        "dependencies": [
          "/Volumes/<catalog>/<schema>/<volume>/<path>.whl",
          "/Workspace/my_project/dist.whl",
          "simplejson",
          "-r /Workspace/my_project/requirements.txt"
        ]
      }
    }
  ]
}
In this example, the job is named "Serverless Job" and has a single task with the key "My_task". The task is a Python wheel task that runs the "run_analysis" entry point from the "databricks_jaws" package, and it runs in the "my-serverless-compute" environment defined under "environments". The job also has a tag indicating that it belongs to the "sales" department.
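To make this tutorial-style, here is a minimal sketch of actually submitting such a spec to the Jobs 2.1 create endpoint from Python. The job name, task key, and notebook path are hypothetical placeholders, and the sketch assumes your workspace URL and a personal access token are available in DATABRICKS_HOST and DATABRICKS_TOKEN environment variables; the key point is that the spec contains no cluster fields, so the job runs on serverless compute.

```python
import json
import os
import urllib.request


def build_serverless_job_spec():
    """Build a minimal job spec with no cluster configuration.

    Omitting new_cluster / existing_cluster_id / job_cluster_key is
    what makes the job run on serverless compute.
    """
    return {
        "name": "Serverless Job",  # placeholder name
        "tasks": [
            {
                "task_key": "My_task",  # placeholder task key
                # Hypothetical notebook path used for illustration:
                "notebook_task": {"notebook_path": "/Workspace/my_project/analysis"},
            }
        ],
    }


def create_job(host, token):
    """POST the spec to /api/2.1/jobs/create and return the new job_id."""
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/create",
        data=json.dumps(build_serverless_job_spec()).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["job_id"]


if __name__ == "__main__":
    job_id = create_job(
        os.environ["DATABRICKS_HOST"], os.environ["DATABRICKS_TOKEN"]
    )
    print(f"Created serverless job {job_id}")
```

The same spec works with `databricks jobs create --json` in the CLI, since both go through the same REST endpoint.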
a month ago
>this by default will make the job serverless
Aha, very interesting. Do the reference docs (https://docs.databricks.com/api/workspace/jobs_21/create#tasks) state that? If not, can y'all add an explicit mention in the docs?
a month ago
I will share this feedback as this is currently not stated there
a month ago
>I will share this feedback as this is currently not stated there
Awesome, thanks!
One follow-up question: is it possible to provide a Docker image as part of serverless? (It appears to be a feature of a *cluster*, not a *job*, so I think the answer is no; however, I just want to make sure I've understood).
a month ago
No, as of now it is not possible
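For contrast, here is a sketch of where a custom Docker image does go: it is a setting on classic compute, nested under new_cluster in a task. All values below (registry URL, node type, Spark version) are hypothetical placeholders; the structure illustrates why the option doesn't apply to serverless, where there is no cluster block to attach it to.

```python
# Task definition using classic (non-serverless) compute with a custom
# Docker image. The docker_image field lives inside new_cluster, i.e. it
# is a cluster-level setting rather than a job-level one.
task_with_docker = {
    "task_key": "My_task",  # placeholder task key
    "notebook_task": {"notebook_path": "/Workspace/my_project/analysis"},
    "new_cluster": {
        "spark_version": "15.4.x-scala2.12",  # placeholder runtime version
        "node_type_id": "i3.xlarge",          # placeholder node type
        "num_workers": 2,
        "docker_image": {
            # Hypothetical registry and credentials:
            "url": "myregistry.example.com/my-image:latest",
            "basic_auth": {"username": "<user>", "password": "<token>"},
        },
    },
}
```

Dropping the whole new_cluster block (and with it docker_image) is exactly what switches the task back to serverless compute.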
a month ago
Thanks!