Jobs REST Api - Create new Job with a new Cluster, and install a Maven Library on the Cluster

antoniodavideca
New Contributor III

I need to use the Jobs REST API to create a job on our Databricks cluster.

At job creation it is possible to specify an existing cluster or to create a new one.

I can forward a lot of information to the cluster, but what I would like to specify is a set of Maven libraries to install directly on the cluster itself.

Would that be possible? I don't see this configuration.

The alternative I found is to use:

Clusters API -> create a new cluster

Libraries API -> install a new library on the cluster

Jobs API -> create a new job and run it on the cluster I just created, but that feels a little cumbersome.
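For reference, the three-call workaround described above could look roughly like the sketch below. The workspace URL, token environment variables, node type, Spark version, notebook path, and Maven coordinates are all placeholders, not values from this thread.

```python
# Minimal sketch of the Clusters -> Libraries -> Jobs workaround, assuming the
# workspace URL and a personal access token are in DATABRICKS_HOST / DATABRICKS_TOKEN.
import os
import requests

host = os.environ["DATABRICKS_HOST"]  # e.g. https://<workspace>.cloud.databricks.com
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# 1. Clusters API: create a cluster (node type and Spark version are placeholders).
cluster = requests.post(f"{host}/api/2.0/clusters/create", headers=headers, json={
    "cluster_name": "job-cluster",
    "spark_version": "11.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "num_workers": 2,
}).json()
cluster_id = cluster["cluster_id"]

# 2. Libraries API: install the Maven libraries on that cluster.
requests.post(f"{host}/api/2.0/libraries/install", headers=headers, json={
    "cluster_id": cluster_id,
    "libraries": [{"maven": {"coordinates": "com.example:my-lib:1.0.0"}}],
})

# 3. Jobs API: create a job that runs on the cluster created above.
requests.post(f"{host}/api/2.1/jobs/create", headers=headers, json={
    "name": "my-job",
    "tasks": [{
        "task_key": "main",
        "existing_cluster_id": cluster_id,
        "notebook_task": {"notebook_path": "/Users/me/my-notebook"},
    }],
})
```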

2 REPLIES

Prabakar
Esteemed Contributor III

@Antonio Davide Cali You can specify the existing cluster in your JSON to use it for the job.

To update or push libraries to the job, you can use the JobsUpdate API. Since you want to push libraries to the cluster, you can include them in the new settings and add your libraries there.

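A rough sketch of that update call, assuming the job already exists and runs a single task; the job ID, cluster ID, notebook path, and Maven coordinates are placeholders:

```python
# Push libraries to an existing job via the Jobs Update API (new_settings).
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

requests.post(f"{host}/api/2.1/jobs/update", headers=headers, json={
    "job_id": 123,  # placeholder job ID
    "new_settings": {
        "tasks": [{
            "task_key": "main",
            "existing_cluster_id": "1234-567890-abcde123",
            "notebook_task": {"notebook_path": "/Users/me/my-notebook"},
            # Libraries are attached per task; Maven packages go under "maven".
            "libraries": [{"maven": {"coordinates": "com.example:my-lib:1.0.0"}}],
        }]
    },
})
```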

Yes, I ended up with this solution.

Re-reading the docs (which honestly are not super clear about this), when you forward a `git_source` field you also have to create a task inside `tasks` as a `notebook_task`.

My confusion was that I thought `tasks` was not needed when `git_source` was present, and the `libraries` field is only available at the task level 🙂
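Putting it together, a minimal sketch of a single create call with `git_source`, a `notebook_task`, a new job cluster, and task-level Maven libraries might look like this. The repo URL, branch, notebook path, node type, Spark version, and coordinates are placeholders:

```python
# Create a job in one call: git_source + notebook_task + new_cluster + libraries.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

requests.post(f"{host}/api/2.1/jobs/create", headers=headers, json={
    "name": "my-job",
    "git_source": {
        "git_url": "https://github.com/example/repo",
        "git_provider": "gitHub",
        "git_branch": "main",
    },
    "tasks": [{
        "task_key": "main",
        # With git_source set, the notebook path is resolved inside the repo.
        "notebook_task": {"notebook_path": "notebooks/my-notebook", "source": "GIT"},
        "new_cluster": {
            "spark_version": "11.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
        # Task-level libraries are installed on the job cluster before the run.
        "libraries": [{"maven": {"coordinates": "com.example:my-lib:1.0.0"}}],
    }],
})
```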

Thank you
