Enabled the Jobs API 2.1 feature and unable to create a shared job cluster

Jed
New Contributor II

Hello,

We enabled the Jobs API 2.1 feature, and when I attempt to create a "shared" job cluster in the job configuration I always get this response:

{"error_code": "FEATURE_DISABLED", "message": "Shared job cluster feature is not enabled."}

Could you please help?
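For context, a shared job cluster is defined in the `job_clusters` array of a Jobs API 2.1 `jobs/create` request and referenced by tasks via `job_cluster_key`. The sketch below shows the general shape of such a payload; the job name, notebook path, Spark version, and node type are illustrative placeholders, not values from this thread.

```python
import json

# Hypothetical sketch of a Jobs API 2.1 jobs/create payload that defines
# a shared job cluster. All names and versions are placeholders.
payload = {
    "name": "example-job",
    "job_clusters": [
        {
            "job_cluster_key": "shared_cluster",
            "new_cluster": {
                "spark_version": "10.4.x-scala2.12",  # placeholder runtime
                "node_type_id": "i3.xlarge",          # placeholder node type
                "num_workers": 2,
            },
        }
    ],
    "tasks": [
        {
            "task_key": "task_a",
            # The task points at the shared cluster by its key:
            "job_cluster_key": "shared_cluster",
            "notebook_task": {"notebook_path": "/Example/NotebookA"},
        }
    ],
}

# The actual call (not executed here) would POST this payload to
# {host}/api/2.1/jobs/create with a bearer-token Authorization header.
print(json.dumps(payload, indent=2))
```

If the shared job cluster feature is not enabled for the workspace, a request like this is what produces the `FEATURE_DISABLED` error shown above.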

1 ACCEPTED SOLUTION

Accepted Solutions

Kaniz
Community Manager

Hi @Jed Lechner​ , Would you like to raise a support ticket here?

Databricks offers a number of plans that provide you with dedicated support and timely service for the Databricks platform and Apache Spark.

If your organisation does not have a Databricks support subscription, or if you are not an authorised contact for your company’s support subscription, you can find answers to many questions on the Databricks Help Centre.

If you are already an authorised Databricks support contact for your organisation, this article shows you how to manage the support process.


5 REPLIES

Kaniz
Community Manager

Hi @Jed Lechner​! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise, I will get back to you soon. Thanks.

Kaniz
Community Manager

Hi @Jed Lechner​ , A shared job cluster allows multiple tasks in the same job run to reuse the cluster. You can use a single job cluster to run all tasks that are part of the job or multiple job clusters optimized for specific workloads. To use a shared job cluster:

  1. Select New Job Clusters when you create a task and complete the cluster configuration.
  2. Select the new cluster when adding a task to the job, or create a new job cluster. Any cluster you configure when you select New Job Clusters is available to any task in the job.

A shared job cluster is scoped to a single job run, and cannot be used by other jobs or runs of the same job.

Libraries cannot be declared in a shared job cluster configuration. You must add dependent libraries in task settings.

Link to the Doc
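The points above (one cluster reused across tasks in a run, and libraries declared per task rather than on the shared cluster) can be sketched as a job spec. This is a hedged illustration, not a spec from this thread; the cluster key, notebook paths, and PyPI package are placeholder assumptions.

```python
# Hypothetical sketch: two tasks in one job run reuse a single shared
# job cluster, and a dependent library is declared on the task settings,
# not in the shared cluster configuration (which does not accept libraries).
job_spec = {
    "name": "etl-pipeline",
    "job_clusters": [
        {
            "job_cluster_key": "main_cluster",
            "new_cluster": {
                "spark_version": "10.4.x-scala2.12",  # placeholder runtime
                "node_type_id": "i3.xlarge",          # placeholder node type
                "num_workers": 4,
            },
        }
    ],
    "tasks": [
        {
            "task_key": "ingest",
            "job_cluster_key": "main_cluster",
            "notebook_task": {"notebook_path": "/Pipelines/Ingest"},
            # Dependent libraries belong in task settings:
            "libraries": [{"pypi": {"package": "great-expectations"}}],
        },
        {
            "task_key": "transform",
            "depends_on": [{"task_key": "ingest"}],
            "job_cluster_key": "main_cluster",  # same shared cluster
            "notebook_task": {"notebook_path": "/Pipelines/Transform"},
        },
    ],
}

# Every task that references "main_cluster" runs on the same cluster
# within a single job run; the cluster is not reusable across runs.
print({t["job_cluster_key"] for t in job_spec["tasks"]})
```

Note that the shared cluster's `new_cluster` block carries no `libraries` field; each task declares its own dependencies.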

Atanu
Esteemed Contributor

Hi @Jed Lechner​, are you able to access the same with the 2.0 API?

Jed
New Contributor II

I am able to access it now. To summarize the problem from my perspective: the shared cluster API did not work as expected, and direct manual intervention by Databricks support was required.

