Databricks Materialized View - DLT Serverless Incremental
12-06-2024 05:16 AM
I'm currently working with Delta Live Tables, utilizing materialized views and serverless computing. While testing the incremental updates of materialized views, I've observed that deleting a record from the source table triggers a complete refresh of the materialized view. Upon reviewing the materialized view logs, I discovered this behavior:
message: Flow 'teste_novo_2' has been planned in DLT to be executed as COMPLETE_RECOMPUTE. Another option is available:ROW_BASED. COMPLETE_RECOMPUTE was chosen in the current run for its optimal performance.
details:
{
  "planning_information": {
    "technique_information": [
      {
        "incrementalization_issues": [
          {
            "issue_type": "INCREMENTAL_PLAN_REJECTED_BY_COST_MODEL",
            "prevent_incrementalization": true,
            "cost_model_rejection_subtype": "CHANGESET_SIZE_THRESHOLD_EXCEEDED"
          }
        ]
      },
      {
        "maintenance_type": "MAINTENANCE_TYPE_COMPLETE_RECOMPUTE",
        "is_chosen": true,
        "is_applicable": true,
        "cost": 4.75717377E8
      },
      {
        "maintenance_type": "MAINTENANCE_TYPE_ROW_BASED",
        "is_chosen": false,
        "is_applicable": true,
        "cost": 1.18965781E8
      }
    ],
    "source_table_information": [
      {
        "table_name": "`mul_dev_tests`.`dlt_managed`.`tabela_teste_4`",
        "table_id": "c7147455-60dd-442b-924e-ca103c82f026",
        "full_size": 4.79009539E8,
        "change_size": 1.19579857E8,
        "is_size_after_pruning": true,
        "is_row_id_enabled": true,
        "is_cdf_enabled": true,
        "is_deletion_vector_enabled": true,
        "is_change_from_legacy_cdf": false
      }
    ],
    "target_table_information": {
      "table_name": "`mul_dev_tests`.`dlt_managed`.`teste_novo_2`",
      "table_id": "bfc03f09-286d-4da1-bf71-32f80ebf1739",
      "full_size": 3.42395821E8,
      "is_row_id_enabled": true,
      "is_cdf_enabled": true,
      "is_deletion_vector_enabled": true
    }
  }
}
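For context, the logged `change_size` and `full_size` fields can be used to see how large the changeset was relative to the source table. This is just a calculation over the values in the log above; the actual threshold the cost model compares against is not documented:

```python
# Parse the source-table figures from the planning log and compute
# what fraction of the source table the change touched.
import json

details = json.loads("""
{"planning_information": {"source_table_information": [
  {"table_name": "`mul_dev_tests`.`dlt_managed`.`tabela_teste_4`",
   "full_size": 4.79009539E8,
   "change_size": 1.19579857E8}]}}
""")

source = details["planning_information"]["source_table_information"][0]
ratio = source["change_size"] / source["full_size"]
print(f"changeset is {ratio:.1%} of the source table")  # prints "changeset is 25.0% of the source table"
```

Note that `change_size` is a byte size, not a row count, which is why even a small delete can produce a large changeset.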
What does the CHANGESET_SIZE_THRESHOLD_EXCEEDED error mean? Is there a resource that lists all incrementalization errors? Additionally, is there a method to handle deletions incrementally in materialized views?
Note: I tested deleting just one row and also 3k rows; the same thing happens in both cases.
12-06-2024 05:27 AM
Same problem here
12-06-2024 06:37 AM
EDIT: My Delta Lake table contains 136 columns. I initially tested with fewer columns, and both updates and deletes were applied incrementally without issues. Specifically, with 34 columns everything worked fine, but when I increased the number of columns to 40, I ran into the CHANGESET_SIZE_THRESHOLD_EXCEEDED issue described above.
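The column-count observation is consistent with a size-based cutoff: since `change_size` is measured in bytes, wider rows make the changeset larger, and at some point it crosses a fraction of the source table that makes the cost model reject the incremental plan. Below is a purely illustrative sketch of such a decision; the 10% threshold is invented for this example, and the real logic is internal to DLT:

```python
# Illustrative only: mimics a size-ratio cutoff like the one the DLT cost
# model appears to apply. The threshold value here is made up.
CHANGESET_RATIO_THRESHOLD = 0.10  # hypothetical value, not documented

def choose_maintenance(change_size: float, full_size: float) -> str:
    """Pick a refresh strategy based on how much of the source changed (in bytes)."""
    if change_size / full_size > CHANGESET_RATIO_THRESHOLD:
        return "COMPLETE_RECOMPUTE"  # changeset too large to apply row by row
    return "ROW_BASED"

# With the sizes from the log above (~25% of the table changed):
print(choose_maintenance(1.19579857e8, 4.79009539e8))  # COMPLETE_RECOMPUTE
# A change touching ~1% of the table would stay incremental under this sketch:
print(choose_maintenance(5e6, 4.79009539e8))  # ROW_BASED
```

Under a model like this, narrower tables (or pruning unused columns before the MV) would shrink `change_size` and keep refreshes on the ROW_BASED path, matching what was observed at 34 vs. 40 columns.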

