Is there a (request) size limit for Databricks REST API SQL statements?
01-19-2024 08:38 AM - edited 01-19-2024 08:38 AM
When inserting rows through the SQL Statement Execution API (`/api/2.0/sql/statements/`), the call fails once the statement includes more than a certain number of records (about 25 records with 8 small columns), with the error:
"The request could not be processed by the warehouse."
Including fewer records (fewer than 25 in this case) lets the call succeed. The request body looks like this:
"statement": "INSERT OVERWRITE TableName (Columns....) VALUES
(Named markers record1 for 8 columns...),
(Named markers record2 for 8 columns...),
(Named markers record3 for 8 columns...),
etc. etc. etc. for more records
"wait_timeout": "10s",
"on_wait_timeout": "CONTINUE",
"parameters": [
{
"name": "Marker",
"type": "LONG",
"value": 1705680165682
},
etc. etc. etc. all parameters for the named markers for all records.
]
My question therefore is: what limit am I hitting here? Note that I am referring to the request, not the response, for which the limits are documented. The total size of the request is around 30 to 40 KB.
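For reference, the request body above can be generated programmatically. The sketch below builds a multi-row parameterized `INSERT` with one named marker per value, matching the shape of the payload in this post. The table and column names (`demo_table`, `col_0`..`col_7`) are placeholders of mine, not from the actual workload, and the string `value` follows the statement-execution parameter format.

```python
# Sketch: building a multi-row parameterized INSERT body for the
# /api/2.0/sql/statements/ endpoint. Table/column names are hypothetical.

def build_insert_body(table, columns, rows, wait_timeout="10s"):
    """Build a statement-execution request body with one named
    marker (e.g. :r0_col_0) per value, as in the payload above."""
    value_groups = []
    parameters = []
    for r, row in enumerate(rows):
        markers = []
        for col, value in zip(columns, row):
            name = f"r{r}_{col}"
            markers.append(f":{name}")
            parameters.append({
                "name": name,
                "type": "LONG" if isinstance(value, int) else "STRING",
                "value": str(value),
            })
        value_groups.append("(" + ", ".join(markers) + ")")
    statement = (
        f"INSERT OVERWRITE {table} ({', '.join(columns)}) VALUES "
        + ", ".join(value_groups)
    )
    return {
        "statement": statement,
        "wait_timeout": wait_timeout,
        "on_wait_timeout": "CONTINUE",
        "parameters": parameters,
    }

body = build_insert_body(
    "demo_table",
    [f"col_{i}" for i in range(8)],
    [[1705680165682 + i] * 8 for i in range(25)],
)
print(len(body["parameters"]))  # 25 rows x 8 columns = 200 parameters
```

At 8 columns and ~25 rows this produces 200 parameter objects, which is well within the documented query text limit, consistent with the ~30-40 KB size reported above.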
01-19-2024 08:18 PM
Hi, there are API rate limits; see: https://docs.databricks.com/en/resources/limits.html#limits-api-rate-limits
There is also a query text size limit for statement execution; see: https://docs.databricks.com/api/workspace/statementexecution
Let us know if this helps.
01-22-2024 03:54 AM
Hi @Debayan, thanks for your response. I am not hitting the rate limits, and the query text is only around 30 KB, nowhere near the 16 MiB mentioned in the documentation.
Any other suggestions are certainly welcome!
04-19-2024 08:01 AM
@TheIceBrick, did you find out anything more about this?
I am experiencing exactly the same: I can insert up to about 35 rows, but the call breaks at around 50 rows.
The payload size is 42 KB, and I am passing parameters for each row.
@Debayan
This is nowhere near the 16 MiB / 25 MiB limits in the documentation.
One more observation: the error does not appear in the warehouse's "Query History" view, as it does for other issues such as bad SQL syntax.
Effectively, this call errors out before it ever becomes a query on the warehouse.
Is there any other place to check for lower-level warehouse errors?
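Since both reports hit a row-count threshold well below the documented size limits, one pragmatic workaround (not an answer to the underlying limit question) is to split the rows into batches below the observed ~25-35 row threshold. Note a subtlety with the statement used in this thread: `INSERT OVERWRITE` replaces the table contents, so only the first batch may use `OVERWRITE`; subsequent batches must use `INSERT INTO`, or each batch would wipe out the previous one. A minimal sketch of that batching logic:

```python
# Workaround sketch: chunk the rows and track which INSERT mode each
# chunk needs. Only the first chunk overwrites; the rest append.

def batch_rows(rows, batch_size=20):
    """Yield (mode, chunk) pairs: 'OVERWRITE' for the first chunk,
    'INTO' for every following chunk."""
    for start in range(0, len(rows), batch_size):
        mode = "OVERWRITE" if start == 0 else "INTO"
        yield mode, rows[start:start + batch_size]

batches = list(batch_rows(list(range(50)), batch_size=20))
print([(mode, len(chunk)) for mode, chunk in batches])
# [('OVERWRITE', 20), ('INTO', 20), ('INTO', 10)]
```

Each `(mode, chunk)` pair would then be turned into its own `/api/2.0/sql/statements/` request; the batch size here is an assumption based on the thresholds reported in this thread.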

