
Can we parameterize the compute in a job cluster?

NarenderKumar
New Contributor III

I have created a workflow job in Databricks with job parameters.

I want to run the same job with different workloads and data volumes.

So I want the compute cluster to be parameterized, so that I can pass the compute requirements (driver size, executor size, and number of nodes) dynamically when I run the job.

Is this possible in Databricks?

2 REPLIES

raphaelblg
Honored Contributor

Hi @NarenderKumar, if you want to change an existing job's compute, you would have to update the job settings before triggering a new run. Feel free to open a feature request with your idea through the Databricks Ideas Portal.
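For example, here is a minimal sketch of that flow against the Jobs 2.1 REST API, assuming the job uses a shared job cluster keyed "main"; the host, token, job ID, node type, and worker count are all placeholders:

```python
# Minimal sketch: resize a job cluster via jobs/update, then trigger jobs/run-now.
# HOST, TOKEN, JOB_ID, "main", and the cluster sizing below are placeholders.
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
JOB_ID = 123
headers = {"Authorization": f"Bearer {TOKEN}"}

# 1. Patch the job's cluster definition with the sizing for this run.
requests.post(
    f"{HOST}/api/2.1/jobs/update",
    headers=headers,
    json={
        "job_id": JOB_ID,
        "new_settings": {
            "job_clusters": [
                {
                    "job_cluster_key": "main",  # assumed key of the shared job cluster
                    "new_cluster": {
                        "spark_version": "14.3.x-scala2.12",
                        "node_type_id": "i3.2xlarge",  # larger nodes for heavy runs
                        "num_workers": 8,
                    },
                }
            ]
        },
    },
).raise_for_status()

# 2. Trigger the run, passing the job parameters for this workload.
requests.post(
    f"{HOST}/api/2.1/jobs/run-now",
    headers=headers,
    json={"job_id": JOB_ID, "job_parameters": {"data_volume": "large"}},
).raise_for_status()
```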

 

Best regards,

Raphael Balogo
Sr. Technical Solutions Engineer
Databricks

brockb
Valued Contributor

Hi @NarenderKumar,

Have you considered leveraging autoscaling for the existing cluster?
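For reference, a job cluster spec that uses an autoscale range instead of a fixed worker count looks roughly like this (the node type and bounds are illustrative):

```python
# Illustrative cluster spec with autoscaling: Databricks scales the worker count
# between min_workers and max_workers based on load, instead of a fixed size.
autoscaling_cluster = {
    "spark_version": "14.3.x-scala2.12",
    "node_type_id": "i3.xlarge",  # placeholder node type
    "autoscale": {"min_workers": 2, "max_workers": 10},
}
```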

If this does not meet your needs, are the differing volumes/workloads known in advance? If so, could different compute be provisioned using Infrastructure as Code based on the workload characteristics? Here's a doc on using Terraform with Databricks: https://docs.databricks.com/en/dev-tools/terraform/index.html
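To sketch the idea in Python rather than Terraform HCL (the linked doc covers the actual Terraform resources), you could keep one compute profile per known workload and apply the matching spec before each run; the profile names and sizes here are hypothetical:

```python
# Hypothetical compute profiles per workload; names and sizes are made up.
COMPUTE_PROFILES = {
    "small": {"node_type_id": "i3.xlarge", "num_workers": 2},
    "large": {"node_type_id": "i3.2xlarge", "num_workers": 12},
}

def cluster_spec(profile: str) -> dict:
    """Build a job cluster spec for a known workload profile."""
    return {"spark_version": "14.3.x-scala2.12", **COMPUTE_PROFILES[profile]}
```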

Thank you. 
