Here is the policy I just created:
{
  "node_type_id": {
    "defaultValue": "Standard_D8s_v3",
    "type": "allowlist",
    "values": [
      "Standard_D8s_v3",
      "Standard_D16s_v3"
    ]
  },
  "num_workers": {
    "hidden": false,
    "type": "fixed",
    "value": 1
  },
  "cluster_type": {
    "hidden": false,
    "type": "fixed",
    "value": "all-purpose"
  },
  "azure_attributes.availability": {
    "type": "fixed",
    "value": "SPOT_WITH_FALLBACK_AZURE",
    "hidden": true
  }
}
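For context, this is roughly how a policy like this can be created programmatically with the Databricks SDK for Python (just a sketch of one way to do it; the policy name is a placeholder, and the API takes the rules above as a JSON string in the definition field):

# Sketch: create the cluster policy from the rules above.
# The name "all-purpose-only-demo" is a placeholder, not my real policy name.
import json
from databricks.sdk import WorkspaceClient

policy_rules = {
    "node_type_id": {
        "defaultValue": "Standard_D8s_v3",
        "type": "allowlist",
        "values": ["Standard_D8s_v3", "Standard_D16s_v3"],
    },
    "num_workers": {"hidden": False, "type": "fixed", "value": 1},
    "cluster_type": {"hidden": False, "type": "fixed", "value": "all-purpose"},
    "azure_attributes.availability": {
        "type": "fixed",
        "value": "SPOT_WITH_FALLBACK_AZURE",
        "hidden": True,
    },
}

w = WorkspaceClient()  # auth comes from the environment / ~/.databrickscfg
created = w.cluster_policies.create(
    name="all-purpose-only-demo",
    definition=json.dumps(policy_rules),  # the API expects the rules as a JSON string
)
print(created.policy_id)  # this id is what goes into policy_id in the bundle below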
Notice that I made the cluster_type value fixed to "all-purpose". After the policy was created, I tried to create a job under this policy:
# The main job for asset_bundle_workflow_demo.
resources:
  jobs:
    asset_bundle_workflow_demo_job:
      name: asset_bundle_workflow_demo_job
      schedule:
        # Run every day at 8:37 AM
        quartz_cron_expression: '44 37 8 * * ?'
        timezone_id: Europe/Amsterdam
      tasks:
        - task_key: notebook_task
          job_cluster_key: my_job_cluster
          notebook_task:
            notebook_path: ../src/notebook.ipynb
      job_clusters:
        - job_cluster_key: my_job_cluster
          new_cluster:
            policy_id: xxxxxxxxxxxxxxx
            num_workers: 1
            spark_version: 15.4.x-scala2.12
            node_type_id: Standard_D8s_v3
            cluster_type: all-purpose
            azure_attributes:
              availability: SPOT_WITH_FALLBACK_AZURE
And here is the error message I got when trying to deploy the asset bundle:
Warning: unknown field: cluster_type
at resources.jobs.asset_bundle_workflow_demo_job.job_clusters[0].new_cluster
in resources/asset_bundle_workflow_demo_job.yml:35:13
Error: terraform apply: exit status 1
Error: cannot create job: Cluster validation error: Validation failed for cluster_type, the value must be all-purpose (is "job")
with databricks_job.asset_bundle_workflow_demo_job,
on bundle.tf.json line 63, in resource.databricks_job.asset_bundle_workflow_demo_job:
63: }
I’m confused because I explicitly set the cluster_type value in the new_cluster section. What is wrong with my code?
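In case it helps, this is how I can read the policy back from the workspace to double-check the definition it actually has (again only a sketch using the Databricks SDK for Python; the policy id is the same placeholder as in the bundle YAML):

# Sketch: fetch the policy and print its definition to confirm that
# cluster_type really is fixed to "all-purpose" on the workspace side.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
policy = w.cluster_policies.get(policy_id="xxxxxxxxxxxxxxx")  # placeholder id from the YAML
print(policy.name)
print(policy.definition)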