Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Is there a (request-) size limit for the Databricks Rest Api Sql statements?

TheIceBrick
New Contributor III

When inserting rows through the SQL Statement Execution API (/api/2.0/sql/statements/), the call fails once more than a certain number of records (about 25 records with 8 small columns) are included in the statement, with the error:

"The request could not be processed by the warehouse."

Including fewer records (fewer than 25 in this case) makes the call succeed.

 

"statement": "INSERT OVERWRITE TableName (Columns....) VALUES 
    (Named markers record1 for 8 columns...),
    (Named markers record2 for 8 columns...),
    (Named markers record3 for 8 columns...),
    etc. etc. etc. for more records
"wait_timeout": "10s",
"on_wait_timeout": "CONTINUE",
"parameters": [
{
    "name": "Marker",
    "type": "LONG",
    "value": 1705680165682
},
etc. etc. etc. all parameters for the named markers for all records.
]

 

My question therefore is: what limit am I hitting here? Note that I am referring to the request, not the response, for which the limits are documented. The total size of the request is around 30-40 KB.
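Since the thread establishes that smaller statements succeed, one practical workaround is to build the same parameterized INSERT but split the rows across several smaller calls. The sketch below is illustrative only: the table, columns, warehouse ID, and chunk size are hypothetical, the parameter type is LONG simply to mirror the example above, and the field names follow the Statement Execution API request shape shown in the question.

```python
# Sketch of building bodies for POST /api/2.0/sql/statements/ in
# batches small enough to stay under the observed failure point.
# Names and sizes here are assumptions, not documented limits.

def build_insert_payload(warehouse_id, table, columns, rows, overwrite=True):
    """Build one statement body with named markers (:p0_0, :p0_1, ...)."""
    value_groups = []
    parameters = []
    for r, row in enumerate(rows):
        names = [f"p{r}_{c}" for c in range(len(columns))]
        value_groups.append("(" + ", ".join(f":{n}" for n in names) + ")")
        for name, value in zip(names, row):
            # "LONG" mirrors the thread's example; real columns may differ.
            parameters.append({"name": name, "type": "LONG", "value": value})
    verb = "INSERT OVERWRITE" if overwrite else "INSERT INTO"
    statement = (
        f"{verb} {table} ({', '.join(columns)}) VALUES "
        + ", ".join(value_groups)
    )
    return {
        "warehouse_id": warehouse_id,
        "statement": statement,
        "wait_timeout": "10s",
        "on_wait_timeout": "CONTINUE",
        "parameters": parameters,
    }

def chunked(rows, size=20):
    """Yield row batches; ~20 stays below the ~25-row failure seen above."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]
```

Note that INSERT OVERWRITE replaces the table contents, so when chunking, only the first batch should use OVERWRITE; subsequent batches need INSERT INTO (the `overwrite` flag above), otherwise each chunk wipes out the previous one.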

3 REPLIES

Debayan
Databricks Employee

Hi, there are API rate limits; see https://docs.databricks.com/en/resources/limits.html#limits-api-rate-limits .

There is also a query text size limit for statement execution; see https://docs.databricks.com/api/workspace/statementexecution

Let us know if this helps.

TheIceBrick
New Contributor III

Hi @Debayan, thanks for your response. I am not hitting the rate limits, and the query text is only around 30 KB, nowhere near the 16 MiB mentioned in the documentation.

Any other suggestions are certainly welcome!

ChrisCkx
New Contributor II

@TheIceBrick did you find out anything else about this?
I am experiencing exactly the same: I can insert up to about 35 rows, but the call breaks at around 50 rows.
The payload size is 42 KB, and I am passing parameters for each row.

@Debayan 
This is nowhere near the 16 MiB / 25 MiB limits in the documentation.

One more aspect is that the error doesn't appear under the "Query History" view for the warehouse, as it does for other issues such as bad SQL syntax.

Effectively, this case errors out before it ever becomes a query in the warehouse.

Is there any other place to check for lower level warehouse errors?
