Is there a (request) size limit for the Databricks REST API SQL statements?

TheIceBrick
New Contributor III

When inserting rows through the SQL API (/api/2.0/sql/statements/), the call fails as soon as the statement includes more than a certain number of records (about 25 records with 8 small columns each, in my case), with the error:

"The request could not be processed by the warehouse."

Including fewer records (less than 25 in this case) makes the call succeed.

 

"statement": "INSERT OVERWRITE TableName (Columns....) VALUES 
    (Named markers record1 for 8 columns...),
    (Named markers record2 for 8 columns...),
    (Named markers record3 for 8 columns...),
    etc. etc. etc. for more records
"wait_timeout": "10s",
"on_wait_timeout": "CONTINUE",
"parameters": [
{
    "name": "Marker",
    "type": "LONG",
    "value": 1705680165682
},
etc. etc. etc. all parameters for the named markers for all records.
]
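
For completeness, here is a minimal Python sketch of how I build this request (host, token, warehouse ID, table and column names are placeholders, and the per-row marker naming is just one way to keep parameter names unique):

    import requests

    # All names below are placeholders: substitute your own workspace URL,
    # access token, warehouse ID, table and columns.
    HOST = "https://<workspace>.cloud.databricks.com"
    TOKEN = "<personal-access-token>"
    WAREHOUSE_ID = "<warehouse-id>"
    COLUMNS = ["col_a", "col_b"]  # the real case has 8 small columns

    def post_insert(rows, overwrite=True):
        # One named marker per cell, e.g. :col_a_0, :col_b_0, :col_a_1, ...
        tuples, parameters = [], []
        for i, row in enumerate(rows):
            markers = []
            for col in COLUMNS:
                name = f"{col}_{i}"
                markers.append(f":{name}")
                parameters.append({"name": name, "value": str(row[col])})
            tuples.append("(" + ", ".join(markers) + ")")
        verb = "INSERT OVERWRITE" if overwrite else "INSERT INTO"
        body = {
            "warehouse_id": WAREHOUSE_ID,
            "statement": f"{verb} TableName ({', '.join(COLUMNS)}) VALUES "
                         + ", ".join(tuples),
            "wait_timeout": "10s",
            "on_wait_timeout": "CONTINUE",
            "parameters": parameters,
        }
        resp = requests.post(
            f"{HOST}/api/2.0/sql/statements/",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json=body,
        )
        resp.raise_for_status()
        return resp.json()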

 

My question therefore is: what limit am I hitting here? Note that I am referring to the request, not the response, for which the limits are documented. The total size of the request is around 30-40 KB.
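
(For reference, the 30-40 KB figure is the size of the serialized JSON request body, measured roughly like this, with `body` being the request-body dict from the sketch above:)

    import json

    # `body` is the request-body dict built in the sketch above
    print(len(json.dumps(body).encode("utf-8")))  # ~30-40 KB for the failing calls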

3 REPLIES

Debayan
Esteemed Contributor III

Hi, there are API rate limits; see: https://docs.databricks.com/en/resources/limits.html#limits-api-rate-limits

Also, there is a query text size limit in statement execution; see: https://docs.databricks.com/api/workspace/statementexecution

Let us know if this helps.

TheIceBrick
New Contributor III

Hi @Debayan, thanks for your response. I am not hitting the rate limits, and the query text is only around 30 KB, nowhere near the 16 MiB mentioned in the documentation.

Any other suggestions are certainly welcome!

ChrisCkx
New Contributor II

@TheIceBrick did you find out anything else about this?
I am experiencing exactly the same: I can insert up to about 35 rows, but it breaks at around 50 rows.
The payload size is 42 KB, and I am passing parameters for each row.
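
As a stopgap, splitting the insert into batches below the threshold works for me; a rough sketch, reusing a post_insert helper like the one sketched above (note that with INSERT OVERWRITE only the first batch should overwrite, otherwise each batch wipes out the previous one):

    def insert_in_chunks(rows, chunk_size=20):
        # chunk_size is a guess: stay under the ~25-35 rows that still succeed
        for i in range(0, len(rows), chunk_size):
            batch = rows[i:i + chunk_size]
            # overwrite on the first batch only, then append
            post_insert(batch, overwrite=(i == 0))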

@Debayan
This is nowhere near the 16 MiB / 25 MiB limits in the documentation.

One more aspect: the error doesn't appear under the "Query History" view for the warehouse, as it does for other issues such as bad SQL syntax.

Effectively, this case errors out before it ever becomes a query in the warehouse.

Is there any other place to check for lower-level warehouse errors?
