Delta Sharing CDF API error: "RESOURCE_LIMIT_EXCEEDED"
10-13-2023 12:14 AM
Hi,
When attempting to read a particular version from the Databricks Delta Sharing CDF (Change Data Feed) API, even when that version contains only one data file, the request times out with the following error:
"errorCode": "RESOURCE_LIMIT_EXCEEDED",
"message": "A timeout occurred when processing the table after 3 updates across 1 iterations. If it continues to happen, please contact your data provider and request them to optimize their table so that it has fewer files."
API:
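For context, the CDF read goes through the table-changes endpoint of the open Delta Sharing REST protocol. A minimal sketch of building that request in Python; the endpoint host, token handling, and the share/schema/table names are placeholders, not values from this thread:

```python
from urllib.parse import urlencode

# Placeholder values -- substitute your own sharing-server endpoint
# and share/schema/table names.
ENDPOINT = "https://sharing.example.com/delta-sharing"
SHARE, SCHEMA, TABLE = "my_share", "my_schema", "my_table"

def table_changes_url(starting_version, ending_version=None):
    """Build the CDF (table changes) URL per the Delta Sharing REST protocol."""
    params = {"startingVersion": starting_version}
    if ending_version is not None:
        params["endingVersion"] = ending_version
    return (f"{ENDPOINT}/shares/{SHARE}/schemas/{SCHEMA}"
            f"/tables/{TABLE}/changes?{urlencode(params)}")

# Reading a single version means setting both bounds to that version.
url = table_changes_url(starting_version=42, ending_version=42)
print(url)
```

The actual request would also carry an `Authorization: Bearer <token>` header; version `42` is an arbitrary example.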
10-13-2023 01:01 AM
Hi Data_Analytics1,
Run OPTIMIZE on your Delta tables. Refer to https://docs.databricks.com/en/sql/language-manual/delta-optimize.html
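In a Databricks notebook this is a one-line SQL statement; a sketch of composing and running it from Python, where the table name is a placeholder and `spark.sql` is only available inside a Databricks cluster or other Spark session:

```python
# Compose the compaction statement; the three-level table name below is a
# placeholder, not a table from this thread.
table_name = "main.sales.events"
statement = f"OPTIMIZE {table_name}"
print(statement)  # -> OPTIMIZE main.sales.events
# spark.sql(statement)  # run inside a Databricks notebook / Spark session
```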
10-13-2023 01:05 AM
Hi @MaxGendu ,
I executed the OPTIMIZE command on the Delta table, but it only compacts the data files of the current snapshot; the data files belonging to historical versions are not affected. Moreover, the specific version in question contains only one data file, so it would be unchanged by compaction anyway.
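The point above can be illustrated with a toy model of a Delta transaction log (pure Python, an illustration only, not the real Delta Lake format): compaction appends a new version that removes the small files and adds a compacted one, while time travel to an earlier version still resolves to the original files:

```python
# Toy model of a Delta log: each version is a list of (action, file) pairs.
log = [
    [("add", "part-0001.parquet"), ("add", "part-0002.parquet")],  # version 0
    [("add", "part-0003.parquet")],                                # version 1
]

def files_at(version):
    """Replay the log up to `version` to find the live data files."""
    live = set()
    for actions in log[: version + 1]:
        for op, path in actions:
            if op == "add":
                live.add(path)
            else:
                live.discard(path)
    return sorted(live)

before = files_at(0)

# Compaction appends version 2: remove the small files, add one compacted file.
log.append([
    ("remove", "part-0001.parquet"),
    ("remove", "part-0002.parquet"),
    ("remove", "part-0003.parquet"),
    ("add", "part-compacted-0004.parquet"),
])

print(files_at(2))  # current snapshot: only the compacted file
print(files_at(0))  # version 0 still references the original small files
assert files_at(0) == before
```

This is why reading a historical version through CDF still touches the original (possibly numerous) files regardless of any later OPTIMIZE.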