Specifying a cluster when running a job
04-04-2023 01:11 AM
Hi,
Let's say that I am starting jobs with different parameters at a certain time each day in the following manner:
import requests

# Trigger an existing job via run-now, passing the country as a notebook parameter.
response = requests.post(
    "https://%s/api/2.0/jobs/run-now" % (DOMAIN),
    headers={"Authorization": "Bearer %s" % TOKEN},
    json={
        "job_id": job_id,
        "notebook_params": {
            "country_name": str(country_id),
        },
    },
)
I was wondering how I could go about specifying a particular cluster size for a single run of a workflow? And how do you specify that the cluster should be shared among the tasks in the workflow? This would be useful when you have one country_id that needs a bigger cluster than all the other countries, and for similar use cases.
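For the shared-cluster part, from the docs it looks like tasks can point at a common entry under job_clusters - is something like this sketch the right direction? (Field names taken from the Jobs 2.1 docs; the job name, Spark version, and notebook paths are placeholders, untested.)

job_definition = {
    "name": "daily-country-job",  # placeholder name
    "job_clusters": [
        {
            "job_cluster_key": "shared_cluster",
            "new_cluster": {
                "spark_version": "12.2.x-scala2.12",  # placeholder runtime version
                "node_type_id": "m5d.2xlarge",
                "num_workers": 4,
            },
        }
    ],
    "tasks": [
        {
            "task_key": "ingest",
            "job_cluster_key": "shared_cluster",  # both tasks reuse the same cluster
            "notebook_task": {"notebook_path": "/jobs/ingest"},  # placeholder path
        },
        {
            "task_key": "transform",
            "job_cluster_key": "shared_cluster",
            "notebook_task": {"notebook_path": "/jobs/transform"},  # placeholder path
        },
    ],
}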
Thanks in advance.
04-04-2023 09:13 AM
@Tjadi Peeters You can select the Autoscaling/Enhanced Scaling option in Workflows, which will scale the cluster based on workload.
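In a job cluster spec, autoscaling would look roughly like this (just a sketch; the Spark version and worker bounds are placeholders):

new_cluster = {
    "spark_version": "12.2.x-scala2.12",  # placeholder runtime version
    "node_type_id": "m5d.2xlarge",        # the instance type itself stays fixed
    "autoscale": {"min_workers": 2, "max_workers": 8},  # worker count scales within this range
}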
04-04-2023 09:42 AM
Thanks for your reply. The autoscaling functionality I am aware of only scales the number of workers - or is there another one? I am looking to start jobs with different types of workers (i.e., one job starts with an m5d.2xlarge while another uses an m5d.4xlarge).
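For reference, the closest thing I've found so far is the one-time runs/submit endpoint, which takes a full cluster spec per run - something like this sketch (untested; BIG_COUNTRY_ID, the Spark version, worker count, and the notebook path are placeholders, DOMAIN/TOKEN as above):

import requests

# Pick the instance type per country, then submit a one-time run
# with an explicit cluster spec (runs/submit instead of run-now).
node_type = "m5d.4xlarge" if country_id == BIG_COUNTRY_ID else "m5d.2xlarge"

response = requests.post(
    "https://%s/api/2.0/jobs/runs/submit" % (DOMAIN),
    headers={"Authorization": "Bearer %s" % TOKEN},
    json={
        "run_name": "country-%s" % country_id,
        "new_cluster": {
            "spark_version": "12.2.x-scala2.12",  # placeholder runtime version
            "node_type_id": node_type,
            "num_workers": 4,  # placeholder
        },
        "notebook_task": {
            "notebook_path": "/jobs/country_pipeline",  # placeholder path
            "base_parameters": {"country_name": str(country_id)},
        },
    },
)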