I have run into a weird situation. I uploaded a few parquet files (about 10) of sales data into a Volume in my catalog and ran dbt against them. The run succeeded and the table was created. However, when I upload a lot more parquet files, the run fails with:
```
[TASK_WRITE_FAILED] Task failed while writing rows to abfss://datalake-raw-dev@xxxx.dfs.core.windows.net/__unitystorage/schemas/xxxx/tables/zzzz. SQLSTATE: 58030
```
Why is it that a small number of parquet files can be processed, while a large number (about 2,500) fails? For reference, the model is roughly the shape sketched below.
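To make the setup concrete, here is a minimal sketch of the kind of dbt model involved, assuming Databricks SQL's `read_files` reading parquet from a Unity Catalog Volume path; the model name, catalog, schema, and volume names are placeholders, not my real ones:

```sql
-- models/raw_sales.sql  (model name and all paths below are placeholders)
{{ config(materialized='table') }}

-- Read every parquet file in the Volume directory; read_files
-- infers a schema across all matched files.
select *
from read_files(
  '/Volumes/my_catalog/my_schema/raw_sales_volume/*.parquet',
  format => 'parquet'
)
```

With ~10 files this materializes fine; with ~2,500 files the same model hits the TASK_WRITE_FAILED error above.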