Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Create new job api error "MALFORMED_REQUEST"

tum
New Contributor II

Hi,

I'm trying to test the Create Job API (v2.1) with Python, but I get this error:

{ 'error_code': 'MALFORMED_REQUEST',

  'message': 'Invalid JSON given in the body of the request - expected a map'}

How do I validate the JSON body before posting?

This is my JSON:

{
    "name": "testing job",
    "new_cluster": {
        "spark_version": "7.3.x-scala2.12",
        "node_type_id": "r3.xlarge",
        "aws_attributes": {
            "availability": "ON_DEMAND"
        },
        "num_workers": 10
    },
    "email_notifications": {
        "on_start": [],
        "on_success": [],
        "on_failure": []
    },
    "timeout_seconds": 3600,
    "max_retries": 1,
    "schedule": {
        "quartz_cron_expression": "0 15 22 * * ?",
        "timezone_id": "America/Los_Angeles"
    }
}
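One way to catch this kind of error before posting, as a rough sketch: build the payload as a Python dict and let `json.dumps` produce the request body, instead of hand-writing a JSON string. A hand-written string can pick up stray quotes or trailing commas that trigger MALFORMED_REQUEST; a dict serialized by the standard library is always well-formed. (The commented-out `requests.post` call, host, and token below are placeholders, not values from this thread.)

```python
import json

# Build the job spec as a Python dict; json.dumps guarantees valid JSON syntax.
job_spec = {
    "name": "testing job",
    "new_cluster": {
        "spark_version": "7.3.x-scala2.12",
        "node_type_id": "r3.xlarge",
        "aws_attributes": {"availability": "ON_DEMAND"},
        "num_workers": 10,
    },
    "email_notifications": {"on_start": [], "on_success": [], "on_failure": []},
    "timeout_seconds": 3600,
    "max_retries": 1,
    "schedule": {
        "quartz_cron_expression": "0 15 22 * * ?",
        "timezone_id": "America/Los_Angeles",
    },
}

body = json.dumps(job_spec)

# Sanity check: the body must parse back to a JSON object (a "map"),
# not a string or a list -- the error in the question means the server
# received something that was not a top-level object.
parsed = json.loads(body)
assert isinstance(parsed, dict)

# Posting would then look roughly like this (requests must be installed;
# DATABRICKS_HOST and TOKEN are placeholders):
# import requests
# resp = requests.post(
#     f"{DATABRICKS_HOST}/api/2.1/jobs/create",
#     headers={"Authorization": f"Bearer {TOKEN}"},
#     data=body,
# )
```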

3 REPLIES

karthik_p
Esteemed Contributor

@tum It looks like the JSON key-value pairs you are using are causing the issue. spark_version is an individual key-value pair, not an object of its own (e.g. you used new_cluster and added spark_version under it). Please see the example below. As far as I know there is no validation tool that checks whether the object mapping matches the API, but for syntax there are tools like JSON validators.

Clusters API 2.0 | Databricks on AWS

tum
New Contributor II

Thanks @karthik_p, I checked my code again and found some issues with strings in the JSON body.

Jobs API 2.0 >> https://docs.databricks.com/dev-tools/api/2.0/jobs.html#create
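For anyone hitting the same "expected a map" error: one common cause (an assumption for illustration, not necessarily what happened here) is double-encoding the body. With the `requests` library, the `json=` parameter serializes its argument itself, so passing an already-serialized string sends a JSON *string* rather than a JSON object, which is exactly what the server complains about. A minimal sketch:

```python
import json

payload = {"name": "testing job"}

# Pitfall: serializing twice. The server then receives a JSON string,
# not a JSON object ("map").
double_encoded = json.dumps(json.dumps(payload))
assert isinstance(json.loads(double_encoded), str)

# Correct: serialize exactly once (pass this string via requests' data=
# parameter, or pass the dict itself via json= and let requests serialize it).
correct = json.dumps(payload)
assert isinstance(json.loads(correct), dict)
```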

Anonymous
Not applicable

Hi @tum

Hope all is well! Just wanted to check in to see whether you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!