When creating a cluster, ERROR: "Missing required field 'spark_version'" even though it was specified

MattJohn_35453
New Contributor III

Every time I try to create a cluster via the API I get this, even when I am literally using the example JSON from the API documentation:

```
{
  "cluster_name": "single-node-cluster",
  "node_type_id": "i3.xlarge",
  "spark_version": "7.6.x-scala2.12",
  "num_workers": 0,
  "custom_tags": {
    "ResourceClass": "SingleNode"
  },
  "spark_conf": {
    "spark.databricks.cluster.profile": "singleNode",
    "spark.master": "[*, 4]"
  }
}
```

I thought maybe that example Spark version was too old, so I set it to 13.0, but that made no difference.

This is being submitted using the `ApiClient` class from the Python databricks-cli package.

Full error text:

```
HTTPError: 400 Client Error: Bad Request for url: https://mothership-production.cloud.databricks.com/api/2.0/clusters/create

Response from server:
{'details': [{'@type': 'type.googleapis.com/google.rpc.ErrorInfo',
              'domain': '',
              'reason': 'CM_API_ERROR_SOURCE_CALLER_ERROR'}],
 'error_code': 'INVALID_PARAMETER_VALUE',
 'message': "Missing required field 'spark_version'"}
```

In the Python debugger, the request goes through `perform_query()` in `databricks_cli/sdk/api_client.py` (line 174).

If I have the debugger spit out the parameters for that "query", I get:

```
data = {'num_workers': '{"cluster_name": "single-node-cluster", "node_type_id": "i3.xlarge", "spark_version": "7.6.x-scala2.12", "num_workers": 0, "custom_tags": {"ResourceClass": "SingleNode"}, "spark_conf": {"spark.databricks.cluster.profile": "singleNode", "spark.master": "[*, 4]"}}'}
path = '/clusters/create'
headers = {'Authorization': 'Bearer [REDACTED]', 'Content-Type': 'text/json', 'user-agent': 'databricks-cli-0.17.7-'}
method = 'POST'
```
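The debug dump above hints at what went wrong: the entire JSON spec has become the value of a single `num_workers` key. A minimal sketch of the mechanism, using a hypothetical stand-in function whose first named parameter is `num_workers` (an assumption based on the dump; the real databricks-cli signature may differ):

```python
import json

# Hypothetical stand-in for a create-cluster method whose first named
# parameter is num_workers (assumed from the debug dump above).
def create_cluster(num_workers=None, cluster_name=None, spark_version=None,
                   node_type_id=None, custom_tags=None, spark_conf=None):
    supplied = dict(num_workers=num_workers, cluster_name=cluster_name,
                    spark_version=spark_version, node_type_id=node_type_id,
                    custom_tags=custom_tags, spark_conf=spark_conf)
    # Keep only the parameters that were actually supplied.
    return {k: v for k, v in supplied.items() if v is not None}

spec = {
    "cluster_name": "single-node-cluster",
    "spark_version": "7.6.x-scala2.12",
    "num_workers": 0,
}

# Passing the spec as one positional argument: the whole JSON string
# lands in num_workers, and spark_version is never set.
broken = create_cluster(json.dumps(spec))

# Expanding with ** maps each key onto its matching named parameter.
fixed = create_cluster(**spec)
```

With the positional call, `broken` contains only a `num_workers` key holding the serialized spec, so the server-side check for `spark_version` fails; with `**spec`, all three fields arrive by name.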

1 ACCEPTED SOLUTION

MattJohn_35453
New Contributor III

Well, I feel dumb.

I tried the databricks-cli one again, but instead of passing JSON, I did `**cluster_spec` so it would expand the dict as separate parameters and that seemed to work.

I tried the same on the databricks-sdk one and that didn't seem to make a difference, so I'll have to figure that out, since the CLI one is deprecated now, I guess.
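For anyone landing here later, a sketch of what the working databricks-cli call might look like. The `ApiClient`/`ClusterService` names come from the databricks-cli package, the host and token are placeholders, and the call is guarded so nothing fires without a real workspace; this illustrates the `**cluster_spec` expansion, not a verified end-to-end script:

```python
cluster_spec = {
    "cluster_name": "single-node-cluster",
    "node_type_id": "i3.xlarge",
    "spark_version": "7.6.x-scala2.12",
    "num_workers": 0,
    "custom_tags": {"ResourceClass": "SingleNode"},
    "spark_conf": {
        "spark.databricks.cluster.profile": "singleNode",
        "spark.master": "[*, 4]",
    },
}

# Guarded so the sketch doesn't fire a real API call; flip to True
# with a real workspace host and token.
RUN_AGAINST_WORKSPACE = False

if RUN_AGAINST_WORKSPACE:
    from databricks_cli.sdk.api_client import ApiClient
    from databricks_cli.sdk.service import ClusterService

    client = ApiClient(host="https://<workspace>.cloud.databricks.com",
                       token="<token>")
    # Expanding the dict hands each key to its matching named parameter,
    # instead of stuffing the whole spec into the first one.
    ClusterService(client).create_cluster(**cluster_spec)
```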


3 REPLIES

MattJohn_35453
New Contributor III

I tried using the databricks-sdk instead and just get an error on a different parameter:

"DatabricksError: Exactly 1 of virtual_cluster_size, num_workers or autoscale must be specified."

But, as you can see below, it definitely has "num_workers". And again, that JSON is literally pulled from the API docs....

```
cluster_spec = {
    "cluster_name": "single-node-cluster",
    "node_type_id": "i3.xlarge",
    "spark_version": "7.6.x-scala2.12",
    "num_workers": 0,
    "custom_tags": {
        "ResourceClass": "SingleNode"
    },
    "spark_conf": {
        "spark.databricks.cluster.profile": "singleNode",
        "spark.master": "[*, 4]"
    }
}

from databricks.sdk import WorkspaceClient
import json

dbrix_environment = 'production'
dbrix_host = f"https://XXXXXX-{dbrix_environment}.cloud.databricks.com"
dbrix_token = dbutils.secrets.get(scope='funstuff', key='choo-choo')

w = WorkspaceClient(host=dbrix_host, token=dbrix_token)

w.clusters.create(json.dumps(cluster_spec))
```
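A guess at what is happening here, the same shape of bug as the CLI case: if the SDK's `create()` takes `spark_version` as its first positional parameter, then `json.dumps(cluster_spec)` hands the whole JSON string to `spark_version` and nothing to `num_workers`, which would explain the "Exactly 1 of ... must be specified" error. The stand-in below is hypothetical (check the installed SDK's actual signature), and it only illustrates the positional-argument half of the problem; the thread reports that `**cluster_spec` still failed against the real SDK:

```python
import json

# Hypothetical stand-in mirroring a create() whose first positional
# parameter is spark_version (assumed, not the real SDK method).
def create(spark_version, *, virtual_cluster_size=None, num_workers=None,
           autoscale=None, **kwargs):
    specified = [v for v in (virtual_cluster_size, num_workers, autoscale)
                 if v is not None]
    if len(specified) != 1:
        raise ValueError(
            "Exactly 1 of virtual_cluster_size, num_workers or autoscale "
            "must be specified.")
    return {"spark_version": spark_version, "num_workers": num_workers,
            **kwargs}

cluster_spec = {"spark_version": "7.6.x-scala2.12", "num_workers": 0}

# Passing the JSON string positionally: everything lands in
# spark_version, num_workers is never set, and the validation trips.
error_text = None
try:
    create(json.dumps(cluster_spec))
except ValueError as exc:
    error_text = str(exc)

# Expanding the dict supplies each field by name, so the stand-in's
# validation passes.
ok = create(**cluster_spec)
```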


Norway
New Contributor II

Hi, Matt. It would be super useful if you could provide a more complete code example of how it worked.

Paal