I have a float datatype column in a Delta table, and the data to be loaded should be rounded off to 2 decimal places. I'm casting the column to DECIMAL(18,10) and then using the round function from pyspark.sql.functions to round the values to 2 decimal places. When I display the DataFrame before loading it into the Delta table, I get the desired 2-decimal-place values, but after loading into the table, that column shows values with up to 15 decimal places. Is this expected behavior in Delta tables, or does some cluster configuration need to be changed?
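For reference, here is a minimal, Spark-free sketch of the symptom in plain Python (my own example, not the actual pipeline): a value rounded to 2 decimal places can still display many extra digits once it is stored back into a binary floating-point column, because a value like 3.14 has no exact binary representation.

```python
# Round to 2 decimal places, as the DataFrame transformation does.
x = round(3.14159265358979, 2)

# Displayed with default formatting, it looks cleanly rounded:
print(x)            # 3.14

# But the underlying binary double carries extra digits, similar to
# what a float-typed table column may surface:
print(f"{x:.17f}")  # 3.14000000000000012
```

This suggests the question may hinge on the table column's type (float) rather than on the rounding step itself.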