03-09-2022 04:43 AM
Hello! I have the following problem. I want to save a Delta table that contains timestamp columns, but when I write that table with Spark, the timestamp columns become timestamp with time zone. This is a problem in my case because when I then read the table from Trino, the columns come back in timestamp with time zone format. Does anyone know how I can change this behavior?
03-09-2022 05:00 AM
In Spark SQL, "Spark SQL defines the timestamp type as TIMESTAMP WITH SESSION TIME ZONE."
You can check your time zone settings. In a case like this it can sometimes be more comfortable to store the timestamp as a Long.
Here is more info about time zones: https://docs.databricks.com/spark/latest/dataframes-datasets/dates-timestamps.html
If you share an example file and code I could help more.
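A minimal PySpark sketch of both suggestions (the column name event_ts and the paths here are just placeholders, not from your setup):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Check which time zone the session applies when interpreting TIMESTAMP values
print(spark.conf.get("spark.sql.session.timeZone"))

# Workaround: store the timestamp as a Long (epoch seconds) so the column
# carries no time-zone semantics at all; readers convert it however they like
df = spark.read.format("delta").load("/tmp/source_table")  # placeholder path
df = df.withColumn("event_ts_epoch", F.unix_timestamp("event_ts"))
df.drop("event_ts").write.format("delta").mode("overwrite").save("/tmp/target_table")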
03-09-2022 05:13 AM
@Hubert Dudek This only happens if I use Delta. If I use Parquet, Spark saves the column as timestamp without time zone.
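In case it helps anyone reproduce this, a small sketch (paths are placeholders, and it assumes Delta Lake is configured in the session): the same DataFrame is written in both formats, and the observation above is that Trino reads the Delta column as timestamp with time zone but the Parquet one as plain timestamp.

import datetime
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [(1, datetime.datetime(2022, 3, 9, 4, 43))], ["id", "event_ts"]
)

# Same DataFrame, two formats; compare the column type each reader infers
df.write.mode("overwrite").parquet("/tmp/ts_repro_parquet")
df.write.format("delta").mode("overwrite").save("/tmp/ts_repro_delta")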