02-17-2022 11:54 AM
Hello,
We enabled the Jobs API 2.1 feature, and when I attempt to create a "shared" job cluster in the job configuration I always get this response:
{"error_code": "FEATURE_DISABLED", "message": "Shared job cluster feature is not enabled."}
Could you please help?
02-17-2022 01:25 PM
Hi @Jed Lechner! My name is Kaniz, and I'm the technical moderator here. Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question first. Otherwise, I will get back to you soon. Thanks.
02-22-2022 11:52 PM
Hi @Jed Lechner, a shared job cluster allows multiple tasks in the same job run to reuse the cluster. You can use a single job cluster to run all tasks that are part of the job, or multiple job clusters optimized for specific workloads. Note the following when using a shared job cluster:
- A shared job cluster is scoped to a single job run; it cannot be used by other jobs or by other runs of the same job.
- Libraries cannot be declared in a shared job cluster configuration. You must add dependent libraries in the task settings.
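To make the shape of the request concrete, here is a minimal sketch of a Jobs API 2.1 payload that declares a shared job cluster under `job_clusters` and reuses it from each task via `job_cluster_key`. The job name, cluster key, notebook paths, and cluster sizing below are illustrative assumptions, not values from this thread:

```python
# Hypothetical Jobs API 2.1 job definition with a shared job cluster.
# Clusters declared under "job_clusters" can be reused by any task in
# the same job run by referencing their "job_cluster_key".
payload = {
    "name": "example-multitask-job",
    "job_clusters": [
        {
            "job_cluster_key": "shared-cluster",
            "new_cluster": {
                "spark_version": "10.4.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
        }
    ],
    "tasks": [
        {
            "task_key": "ingest",
            # Reuse the shared cluster instead of defining a new one.
            "job_cluster_key": "shared-cluster",
            "notebook_task": {"notebook_path": "/Jobs/ingest"},
            # Libraries are declared on the task, not on the shared
            # job cluster configuration.
            "libraries": [{"pypi": {"package": "requests"}}],
        },
        {
            "task_key": "transform",
            "depends_on": [{"task_key": "ingest"}],
            "job_cluster_key": "shared-cluster",
            "notebook_task": {"notebook_path": "/Jobs/transform"},
        },
    ],
}

# Sanity check: every task references a cluster declared in job_clusters.
declared = {c["job_cluster_key"] for c in payload["job_clusters"]}
assert all(t["job_cluster_key"] in declared for t in payload["tasks"])
```

This payload would be sent to the `POST /api/2.1/jobs/create` endpoint; if the workspace returns `FEATURE_DISABLED` as described above, the request shape is not the problem and the feature needs to be enabled on the workspace side.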
03-08-2022 05:42 AM
Hi @Jed Lechner, are you able to access the same with the 2.0 API?
03-08-2022 05:55 AM
I am able to access it now. To summarize the problem from my perspective: the shared job cluster API did not work as expected, and direct manual intervention by Databricks support was required.
03-17-2022 01:50 AM
Hi @Jed Lechner , Would you like to raise a support ticket here?
Databricks offers a number of plans that provide you with dedicated support and timely service for the Databricks platform and Apache Spark.
If your organisation does not have a Databricks support subscription, or if you are not an authorised contact for your company’s support subscription, you can find answers to many questions on the Databricks Help Centre.
If you are already an authorised Databricks support contact for your organisation, this article shows you how to manage the support process.