03-09-2022 04:43 AM
Hello! I have the following problem. I want to save a Delta table that contains timestamp columns, but when I write the table with Spark, the timestamp columns become timestamp with time zone. This is a problem in my case because when I read the table from Trino, the columns come back in timestamp with time zone format. Does anyone know how I can change this behavior?
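For reference, a minimal sketch of the kind of write described here, assuming PySpark on Databricks with Delta available (the table name and column name are hypothetical):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# One-row DataFrame with a plain TIMESTAMP column.
df = spark.range(1).withColumn("ts", F.current_timestamp())

# Writing as Delta; per this report, reading the table from Trino
# then shows the column as timestamp with time zone.
df.write.format("delta").mode("overwrite").saveAsTable("ts_demo")
```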
Accepted Solutions
03-09-2022 05:00 AM
In Spark SQL, "Spark SQL defines the timestamp type as TIMESTAMP WITH SESSION TIME ZONE."
You can check your time zone settings. In such cases it can sometimes be more convenient to store the timestamp as a Long.
Here is more info about time zones: https://docs.databricks.com/spark/latest/dataframes-datasets/dates-timestamps.html
If you share an example file and code, I can help more.
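A minimal sketch of both suggestions, assuming a PySpark session (the table name `events`, column name `event_ts`, and output path are hypothetical):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Inspect, and optionally pin, the session time zone that Spark uses
# when interpreting TIMESTAMP WITH SESSION TIME ZONE values.
print(spark.conf.get("spark.sql.session.timeZone"))
spark.conf.set("spark.sql.session.timeZone", "UTC")

# Hypothetical table with a timestamp column `event_ts`.
df = spark.table("events")

# Alternative: store the timestamp as a Long (epoch seconds), which
# sidesteps time-zone interpretation entirely.
df_long = df.withColumn("event_ts", F.unix_timestamp("event_ts"))
df_long.write.format("delta").mode("overwrite").save("/tmp/events_delta")
```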
03-09-2022 05:13 AM
@Hubert Dudek This happens only when I use Delta. If I use Parquet, Spark saves the column as timestamp without time zone.
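A minimal sketch of the contrast being described, assuming a PySpark session with Delta available (paths and column name are hypothetical):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.range(1).withColumn("ts", F.current_timestamp())

# Same DataFrame written two ways; per this report, Trino reads the
# Parquet copy as a plain timestamp but the Delta copy as timestamp
# with time zone.
df.write.format("parquet").mode("overwrite").save("/tmp/ts_parquet")
df.write.format("delta").mode("overwrite").save("/tmp/ts_delta")
```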
06-22-2023 05:12 AM
Hi @Hubert Dudek,
When you have time, could you please take a look at the problem I described?

