Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Can we parameterize the compute in job cluster

NarenderKumar
New Contributor III

I have created a workflow job in Databricks with job parameters.

I want to run the same job with different workloads and data volumes.

I would therefore like the compute cluster to be parameterized, so that I can pass the compute requirements (driver size, executor size, and number of nodes) dynamically when I run the job.

Is this possible in Databricks?

2 REPLIES

raphaelblg
Contributor III

Hi @NarenderKumar, if you want to change an existing job's compute, you would have to update the job settings before triggering a new run. Feel free to open a feature request with your idea through the Databricks Ideas Portal.
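As a minimal sketch of that "update settings, then trigger" approach, the payload below targets the Jobs API 2.1 (`POST /api/2.1/jobs/update` followed by `POST /api/2.1/jobs/run-now`). The helper name, `job_cluster_key`, runtime version, and node type are illustrative assumptions, not values from this thread:

```python
import json

# Hypothetical helper: builds the `new_settings` payload for the Jobs API 2.1
# `jobs/update` endpoint, overriding the job cluster's node type and worker
# count before a run is triggered.
def build_resize_payload(job_id, node_type_id, num_workers,
                         job_cluster_key="main_cluster",
                         spark_version="14.3.x-scala2.12"):
    return {
        "job_id": job_id,
        "new_settings": {
            "job_clusters": [
                {
                    "job_cluster_key": job_cluster_key,
                    "new_cluster": {
                        "spark_version": spark_version,
                        "node_type_id": node_type_id,
                        "num_workers": num_workers,
                    },
                }
            ]
        },
    }

payload = build_resize_payload(123, "i3.2xlarge", 8)
print(json.dumps(payload, indent=2))

# The actual REST calls would then be (workspace URL and token required):
#   requests.post(f"{host}/api/2.1/jobs/update", headers=auth, json=payload)
#   requests.post(f"{host}/api/2.1/jobs/run-now", headers=auth,
#                 json={"job_id": 123})
```

Note that `jobs/update` only patches the fields you send, so the rest of the job definition is left intact.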

 

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks

brockb
Contributor III

Hi @NarenderKumar,

Have you considered leveraging autoscaling for the existing cluster?
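For reference, a job cluster can be given an `autoscale` range instead of a fixed `num_workers`, letting Databricks size the cluster to the workload within your bounds. The sketch below shows the shape of such a cluster spec; the runtime version, node type, and worker counts are illustrative assumptions:

```python
# Sketch of a job-cluster spec using autoscaling instead of a fixed size.
# Databricks scales the worker count between min_workers and max_workers
# based on load.
autoscaling_cluster = {
    "spark_version": "14.3.x-scala2.12",  # illustrative runtime version
    "node_type_id": "i3.xlarge",          # illustrative node type
    "autoscale": {
        "min_workers": 2,   # floor for light workloads
        "max_workers": 16,  # ceiling for heavy workloads
    },
}

# A fixed-size spec would instead set "num_workers": 8 in place of
# the "autoscale" block.
print(autoscaling_cluster["autoscale"])
```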

If this does not meet your needs, are the differing volume/workloads known in advance? If so, could different compute be provisioned using Infrastructure as Code based on the differing workload characteristics? Here's a doc on using Terraform with Databricks: https://docs.databricks.com/en/dev-tools/terraform/index.html
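As a sketch of that Infrastructure-as-Code route, the Terraform Databricks provider's `databricks_job` resource can take its cluster sizing from a variable, so differing workloads get differing compute at plan time. The resource name, key, and values below are illustrative assumptions:

```hcl
variable "num_workers" {
  type    = number
  default = 4
}

resource "databricks_job" "etl" {
  name = "parameterized-etl"

  job_cluster {
    job_cluster_key = "main_cluster"
    new_cluster {
      spark_version = "14.3.x-scala2.12"
      node_type_id  = "i3.xlarge"
      num_workers   = var.num_workers
    }
  }
}
```

Running `terraform apply -var="num_workers=16"` would then provision the heavier configuration without editing the job definition.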

Thank you. 
