Hi @Srikanthn, Yes, you can change the datatype of a column in a Delta table without losing the delta log. The delta log records the transaction history, which is what makes time travel possible in Delta tables.
Here is the method to change the datatype of a column in a Delta table:
```python
from pyspark.sql.functions import col

(spark.read.table("Employee")
    .withColumn("Salary", col("Salary").cast("decimal(10,4)"))
    .write.format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("Employee"))
```
This method reads the Delta table (managed or external), changes the datatype of the column "Salary" from decimal(5,2) to decimal(10,4), and overwrites the schema of the table.
To verify the datatype after the change, you can use the following code:
```python
spark.read.table("Employee").printSchema()
spark.read.table("Employee").show(truncate=False)
```
This approach also preserves the delta log, so you can still time travel and see which changes were made in each version.
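For example, assuming the table is registered as `Employee` and your Delta Lake / Databricks runtime supports SQL time travel, a minimal sketch of how to inspect the history and read back an older version:

```python
# Show the commit history recorded in the delta log;
# the schema change above appears as a new version.
spark.sql("DESCRIBE HISTORY Employee").show(truncate=False)

# Time travel: read the table as it was at an earlier version.
# Version 0 is only an example here; pick a version from the history output.
spark.sql("SELECT * FROM Employee VERSION AS OF 0").show(truncate=False)
```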