02-02-2025 04:28 PM
Hello,
What is the best practice in Databricks for storing DateTime data in silver layer tables, considering the source data is in AEST and we store it in UTC by default?
Thanks
Accepted Solutions
02-03-2025 12:37 AM
Hi @Freshman ,
By converting your datetime columns from AEST to UTC and storing them as UTC, you are following best practice.
This way you standardize the time zone and avoid confusion, especially if you later ingest data from multiple time zones.
When you need to display the data in a local time zone (like AEST for end users), perform the conversion from UTC to the desired time zone in your reporting layer.
Additionally, per the documentation, Databricks has no datetime type that stores time zone information. You need to use TIMESTAMP, so converting to UTC before storing in silver is your safest bet.
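As a minimal sketch of the conversion step described above, here is the same logic in plain Python using the stdlib `zoneinfo` module (the Spark equivalent would be `to_utc_timestamp(col, 'Australia/Sydney')`; the function name `aest_to_utc` and the sample timestamp are illustrative, not from the thread). Note that `Australia/Sydney` also handles daylight saving (AEDT), which a fixed AEST offset would not:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

def aest_to_utc(naive_local: datetime) -> datetime:
    """Interpret a naive datetime as Australia/Sydney local time and
    convert it to UTC, mirroring Spark's to_utc_timestamp()."""
    local = naive_local.replace(tzinfo=ZoneInfo("Australia/Sydney"))
    return local.astimezone(ZoneInfo("UTC"))

# 3 Feb 2025 falls in daylight saving (AEDT, UTC+11), so noon local is 01:00 UTC
utc = aest_to_utc(datetime(2025, 2, 3, 12, 0))
print(utc)  # 2025-02-03 01:00:00+00:00
```

Doing this once, before the silver write, means every downstream consumer can treat the column as unambiguous UTC.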
02-03-2025 12:39 AM
@Freshman wrote: Hello,
What is the best practice in Databricks for storing DateTime data in silver layer tables, considering the source data is in AEST and we store it in UTC by default?
Thanks
The best practice for storing DateTime data in the Silver layer of a Databricks data lake is to store it in UTC and include a separate column for the original timezone. This ensures data consistency and allows for accurate time zone conversions during analysis. Utilize Databricks SQL functions like from_utc_timestamp() and to_utc_timestamp() for efficient time zone handling.
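A small sketch of the "UTC plus original-timezone column" pattern suggested above, in plain Python (the row layout and column names `event_time_utc` / `source_tz` are hypothetical; in a Spark reporting query the reverse conversion would be `from_utc_timestamp(event_time_utc, source_tz)`):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical silver-layer row: the event time normalized to UTC,
# with the original time zone kept in its own column.
row = {
    "event_time_utc": datetime(2025, 2, 3, 1, 0, tzinfo=ZoneInfo("UTC")),
    "source_tz": "Australia/Sydney",
}

def to_local_for_display(r: dict) -> datetime:
    """Convert back to the source's local time for reporting,
    mirroring Spark's from_utc_timestamp()."""
    return r["event_time_utc"].astimezone(ZoneInfo(r["source_tz"]))

local = to_local_for_display(row)
print(local)  # 2025-02-03 12:00:00+11:00
```

Keeping the time zone as data rather than baking it into the timestamp lets the same silver table serve users in any region.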
02-03-2025 02:45 PM
Thanks @filipniziol for your advice, appreciate it; that gives us confidence in our solution design. Yes, we do the conversion in the Reporting/Gold layer.
02-03-2025 02:45 PM
Thanks @robert154 for your advice, appreciate it; that gives us confidence in our solution design. Yes, we do the conversion in the Reporting/Gold layer.

