I understand that in your case auto-scaling takes too much time.
The simplest option is to run the other notebook on a separate cluster (and make sure the previous cluster is terminated as soon as it is no longer needed).
Another option is to call the REST API endpoint 2.0/clusters/resize to resize the cluster: https://docs.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/clusters#--resize
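From outside a notebook, that endpoint can be called with a personal access token. A minimal sketch of building the request (the workspace URL and cluster ID below are placeholders, not values from your workspace):

```python
import json

def build_resize_request(workspace_url, cluster_id, num_workers):
    """Build the URL and JSON payload for the Clusters API 2.0 resize call."""
    url = f"{workspace_url}/api/2.0/clusters/resize"
    payload = {"cluster_id": cluster_id, "num_workers": num_workers}
    return url, json.dumps(payload)

url, body = build_resize_request(
    "https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace URL
    "0923-164208-meows279",                                # placeholder cluster ID
    4,
)
# POST `body` to `url` with the header: Authorization: Bearer <personal-access-token>
```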
There is also a way to do it directly from the notebook itself; the script below detects all the required parameters (workspace URL, cluster ID, and API token) automatically.
import requests

# Read the notebook context to auto-detect the workspace host name,
# the ID of the cluster the notebook is attached to, and an API token
ctx = dbutils.notebook.entry_point.getDbutils().notebook().getContext()
domain_name = ctx.tags().get("browserHostName").get()
cluster_id = ctx.clusterId().get()
host_token = ctx.apiToken().get()

# Resize the current cluster to 2 workers
response = requests.post(
    f'https://{domain_name}/api/2.0/clusters/resize',
    headers={'Authorization': f'Bearer {host_token}'},
    json={'cluster_id': cluster_id, 'num_workers': 2},
)
response.raise_for_status()  # fail loudly if the resize request was rejected