Hi folks!
I'm trying to read Iceberg tables that I created in Snowflake from Databricks using catalog federation. I set up a connection to Snowflake, configured an external location that points to the S3 folder containing the Iceberg files, and used the connection to create a foreign catalog.
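For reference, this is roughly how I set it up (object names and option values below are placeholders, and the exact options may differ slightly from what I actually ran):

```sql
-- Connection to the Snowflake account (catalog federation)
CREATE CONNECTION snowflake_conn TYPE snowflake
OPTIONS (
  host 'myaccount.snowflakecomputing.com',
  port '443',
  sfWarehouse 'MY_WAREHOUSE',
  user 'MY_USER',
  password secret('my_scope', 'snowflake_password')
);

-- External location for the S3 path that holds the Iceberg data and metadata files
CREATE EXTERNAL LOCATION iceberg_loc
URL 's3://my-bucket/iceberg/'
WITH (STORAGE CREDENTIAL my_storage_credential);

-- Foreign catalog backed by the Snowflake connection
CREATE FOREIGN CATALOG snowflake_cat
USING CONNECTION snowflake_conn
OPTIONS (database 'MY_SNOWFLAKE_DB');

-- The query I run from Databricks
SELECT COUNT(*) FROM snowflake_cat.my_schema.my_iceberg_table;
```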
The first time, the query runs successfully from Databricks. However, if I update the table in Snowflake and then run it again from Databricks, I get the following error:
Failed to convert the table version 2 to the universal format iceberg. Clone validation failed - Size and number of data files in target table should match with source table. srcTableSize: 168818176, targetTableSize: 314383872 srcTableNumFiles: 25, targetTableNumFiles: 25 SQLSTATE: KD00E
The table contains around 5 million rows.