03-09-2022 04:43 AM
Hello! I have the following problem. I want to save a Delta table that contains timestamp columns, but when I write it with Spark, the timestamp columns become timestamp with time zone. This is a problem in my case because when I read the table from Trino, the columns come back in timestamp with time zone format. Does anyone know how I can change this behavior?
03-09-2022 05:00 AM
In Spark SQL, "Spark SQL defines the timestamp type as TIMESTAMP WITH SESSION TIME ZONE."
Check your time zone settings. In a case like this, it can be more convenient to store the timestamp as a Long.
Here is more info about time zones: https://docs.databricks.com/spark/latest/dataframes-datasets/dates-timestamps.html
If you share an example file and code, I can help more.
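A minimal sketch of both options, assuming PySpark; the table name, column name, and output path are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Option 1: pin the session time zone so timestamps are interpreted consistently.
spark.conf.set("spark.sql.session.timeZone", "UTC")

# Option 2: store the timestamp as epoch seconds (a plain Long),
# which sidesteps time zone semantics entirely.
df = spark.table("events")  # hypothetical source table with an "event_ts" timestamp column
df = df.withColumn("event_ts_epoch", F.unix_timestamp("event_ts"))
df.write.format("delta").mode("overwrite").save("/tmp/delta/events")  # hypothetical path
```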
03-09-2022 05:13 AM
@Hubert Dudek This happens only with Delta. If I use Parquet, Spark saves the column as timestamp without time zone.
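For context, a minimal sketch to reproduce the comparison, assuming PySpark with Delta Lake available; the output paths are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# One DataFrame with a single timestamp column.
df = spark.range(1).select(F.current_timestamp().alias("ts"))

# Write the same data in both formats, then inspect each table from Trino
# to compare how the timestamp type is exposed.
df.write.mode("overwrite").parquet("/tmp/ts_parquet")
df.write.format("delta").mode("overwrite").save("/tmp/ts_delta")
```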
04-27-2022 01:52 AM
Hi @Borislav Blagoev, just a friendly follow-up. Do you still need help, or did @Hubert Dudek's response help you find the solution? Please let us know.