Hi pooja_bhumandla,
How are you doing today? In general, changing the delta.targetFileSize config while a batch or streaming load is in progress won't crash your job, but it can lead to inconsistent behavior during that run. Spark typically reads the config at the start of execution, so if you unset or change it mid-load, only subsequent operations or triggers will pick up the new value. The result can be a mix of file sizes written to the Delta table, which isn't ideal if you're aiming for consistent file sizing. So while it's unlikely to fail your job outright, it's best to change such configurations between loads or restarts to avoid unexpected results (see the sketch below). Hope that helps!
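Here is a minimal PySpark sketch of the pattern I mean, assuming a Databricks environment; the table name "events" and the 128 MB size are just illustrative examples, not something from your setup.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Option 1: set the session-level target file size BEFORE starting the load,
    # so every write in this run uses the same target size.
    spark.conf.set("spark.databricks.delta.targetFileSize", str(128 * 1024 * 1024))

    # ... run the batch or streaming load here ...

    # Change or clear it only after the load finishes (or before the next restart),
    # not while a trigger is mid-flight.
    spark.conf.unset("spark.databricks.delta.targetFileSize")

    # Option 2: set it as a table property instead, so the setting persists with
    # the table rather than the session. "events" is a hypothetical table name.
    spark.sql(
        "ALTER TABLE events SET TBLPROPERTIES ('delta.targetFileSize' = '134217728')"
    )

The table-property route is usually the safer choice when several jobs write to the same table, since every writer sees the same target size regardless of its session configuration.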
Regards,
Brahma