06-13-2024 10:47 PM
I am new to Databricks. I am working through the labs in the Databricks Data Engineering course, and in workbook 4.1 I am getting the error below. Please help me resolve it.
Expected one and only one cluster definition.
06-13-2024 11:59 PM
Here's how you can edit the JSON configuration:
1. Access the cluster configuration.
2. Edit the configuration.
3. Edit it through the JSON interface. A configuration with two cluster definitions looks like this:
{
  "clusters": [
    {
      "cluster_name": "Cluster1",
      "spark_version": "7.3.x-scala2.12",
      "node_type_id": "i3.xlarge",
      "num_workers": 2
    },
    {
      "cluster_name": "Cluster2",
      "spark_version": "7.3.x-scala2.12",
      "node_type_id": "i3.xlarge",
      "num_workers": 2
    }
  ]
}
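Since the error complains that more than one cluster is defined, remove the extra entries so that exactly one remains. A minimal sketch of the corrected JSON, assuming you keep the first entry (Cluster1, the Spark version, and the node type here are placeholders, not values from your workspace):

{
  "clusters": [
    {
      "cluster_name": "Cluster1",
      "spark_version": "7.3.x-scala2.12",
      "node_type_id": "i3.xlarge",
      "num_workers": 2
    }
  ]
}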
06-15-2024 07:37 PM
Thanks, Rishabh Pandey, for your reply. However, when I edit my cluster in JSON, it has only one cluster definition, as shown below.
{
  "num_workers": 0,
  "cluster_name": "Abdul Subahan Mohamed Ghouse's Cluster",
  "spark_version": "11.3.x-scala2.12",
  "spark_conf": {
    "spark.master": "local[*, 4]",
    "spark.databricks.cluster.profile": "singleNode"
  },
  "aws_attributes": {
    "first_on_demand": 1,
    "availability": "SPOT_WITH_FALLBACK",
    "zone_id": "auto",
    "spot_bid_price_percent": 100,
    "ebs_volume_count": 0
  },
  "node_type_id": "r5dn.large",
  "driver_node_type_id": "r5dn.large",
  "ssh_public_keys": [],
  "custom_tags": {
    "ResourceClass": "SingleNode"
  },
  "spark_env_vars": {},
  "autotermination_minutes": 120,
  "enable_elastic_disk": true,
  "init_scripts": [],
  "single_user_name": "subahanma@yahoo.com",
  "enable_local_disk_encryption": false,
  "data_security_mode": "SINGLE_USER",
  "runtime_engine": "PHOTON",
  "cluster_id": "0614-025944-suhd45l7"
}
However, when I check View JSON I get the config details below, but there is no list of cluster entries to remove easily, as you suggested. Would you be able to suggest anything further? Thanks.
{
  "cluster_id": "0614-025944-suhd45l7",
  "creator_user_name": "subahanma@yahoo.com",
  "spark_context_id": 4770338516046754000,
  "driver_healthy": true,
  "cluster_name": "Abdul Subahan Mohamed Ghouse's Cluster",
  "spark_version": "11.3.x-scala2.12",
  "spark_conf": {
    "spark.master": "local[*, 4]",
    "spark.databricks.cluster.profile": "singleNode"
  },
  "aws_attributes": {
    "first_on_demand": 1,
    "availability": "SPOT_WITH_FALLBACK",
    "zone_id": "auto",
    "spot_bid_price_percent": 100
  },
  "node_type_id": "r5dn.large",
  "driver_node_type_id": "r5dn.large",
  "custom_tags": {
    "ResourceClass": "SingleNode"
  },
  "autotermination_minutes": 120,
  "enable_elastic_disk": true,
  "disk_spec": {},
  "cluster_source": "UI",
  "single_user_name": "subahanma@yahoo.com",
  "enable_local_disk_encryption": false,
  "instance_source": {
    "node_type_id": "r5dn.large"
  },
  "driver_instance_source": {
    "node_type_id": "r5dn.large"
  },
  "data_security_mode": "SINGLE_USER",
  "runtime_engine": "PHOTON",
  "effective_spark_version": "11.3.x-photon-scala2.12",
  "state": "TERMINATED",
  "state_message": "Inactive cluster terminated (inactive for 120 minutes).",
  "start_time": 1718333984793,
  "terminated_time": 1718343735955,
  "last_state_loss_time": 0,
  "last_activity_time": 1718334227246,
  "last_restarted_time": 1718334277783,
  "num_workers": 0,
  "default_tags": {
    "Vendor": "Databricks",
    "Creator": "subahanma@yahoo.com",
    "ClusterName": "Abdul Subahan Mohamed Ghouse's Cluster",
    "ClusterId": "0614-025944-suhd45l7"
  },
  "termination_reason": {
    "code": "INACTIVITY",
    "type": "SUCCESS",
    "parameters": {
      "inactivity_duration_min": "120"
    }
  },
  "init_scripts_safe_mode": false,
  "spec": {
    "cluster_name": "Abdul Subahan Mohamed Ghouse's Cluster",
    "spark_version": "11.3.x-scala2.12",
    "spark_conf": {
      "spark.master": "local[*, 4]",
      "spark.databricks.cluster.profile": "singleNode"
    },
    "aws_attributes": {
      "first_on_demand": 1,
      "availability": "SPOT_WITH_FALLBACK",
      "zone_id": "auto",
      "spot_bid_price_percent": 100
    },
    "node_type_id": "r5dn.large",
    "driver_node_type_id": "r5dn.large",
    "custom_tags": {
      "ResourceClass": "SingleNode"
    },
    "autotermination_minutes": 120,
    "enable_elastic_disk": true,
    "single_user_name": "subahanma@yahoo.com",
    "data_security_mode": "SINGLE_USER",
    "runtime_engine": "PHOTON",
    "num_workers": 0
  }
}
06-16-2024 07:27 AM - edited 06-17-2024 05:40 AM
I recreated my compute cluster under the DBAcademy policy, as below, but I still get the same error.
{
  "id": "2bf80b86-1f70-47c2-af4b-3bad17cca47b",
  "pipeline_type": "WORKSPACE",
  "clusters": [
    {
      "label": "default",
      "node_type_id": "i3.xlarge",
      "driver_node_type_id": "i3.xlarge",
      "policy_id": "0001AF378F789E44",
      "num_workers": 0
    },
    {
      "label": "maintenance",
      "policy_id": "0001AF378F789E44"
    }
  ],
  "development": true,
  "continuous": false,
  "channel": "CURRENT",
  "photon": true,
  "libraries": [
    {
      "notebook": {
        "path": "/Shared/data-engineering-with-databricks/DE 4 - Delta Live Tables/DE 4.1A - SQL Pipelines/DE 4.1.1 - Orders Pipeline"
      }
    },
    {
      "notebook": {
        "path": "/Shared/data-engineering-with-databricks/DE 4 - Delta Live Tables/DE 4.1A - SQL Pipelines/DE 4.1.2 - Customers Pipeline"
      }
    },
    {
      "notebook": {
        "path": "/Shared/data-engineering-with-databricks/DE 4 - Delta Live Tables/DE 4.1A - SQL Pipelines/DE 4.1.3 - Status Pipeline"
      }
    }
  ],
  "name": "subahanma-91f8-da-delp-pipeline-demo-pipeline_demo: Example Pipeline",
  "edition": "ADVANCED",
  "storage": "dbfs:/mnt/dbacademy-users/subahanma@yahoo.com/data-engineer-learning-path/pipeline_demo/storage_location",
  "configuration": {
    "spark.master": "local[*]",
    "source": "dbfs:/mnt/dbacademy-users/subahanma@yahoo.com/data-engineer-learning-path/pipeline_demo/stream-source"
  },
  "target": "subahanma_91f8_da_delp_pipeline_demo",
  "data_sampling": false
}
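One thing that may be worth checking (a guess, not a confirmed fix): the error says it expected one and only one cluster definition, and the clusters array above contains two entries, default and maintenance. A sketch of the same settings trimmed to just the default entry, in case the lab's validation only accepts a single definition:

{
  "clusters": [
    {
      "label": "default",
      "node_type_id": "i3.xlarge",
      "driver_node_type_id": "i3.xlarge",
      "policy_id": "0001AF378F789E44",
      "num_workers": 0
    }
  ]
}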
06-17-2024 05:42 AM
I am unable to resolve this. Any help would be appreciated. Thanks