Hi @Rahul Lalwani, the issue you're seeing with the timestamp '1753-01-01 00:00:00' is related to a limitation in the date range Spark handles reliably. Spark's default supported range effectively starts at 1900, and timestamps earlier than that may not be converted accurately.
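For reference, you can reproduce the behaviour with a small round trip through UTC. This is a minimal sketch, assuming an active SparkSession named spark; on environments affected by this limitation, the two columns come back with different values:

# Minimal repro sketch (assumes an active SparkSession named `spark`)
spark.conf.set("spark.sql.session.timeZone", "Australia/Sydney")
spark.sql("""
SELECT
  '1753-01-01 00:00:00' AS original_timestamp,
  from_utc_timestamp(
    to_utc_timestamp('1753-01-01 00:00:00', 'Australia/Sydney'),
    'Australia/Sydney') AS round_tripped
""").show(truncate=False)  # the two columns differ on affected runtimes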
You can work around this by using a timestamp within Spark's supported range. Here's an updated version of your query that uses '1900-01-01 00:00:00' instead of '1753-01-01 00:00:00':
# Set the session time zone, then round-trip both timestamps through UTC
# (assumes an active SparkSession named `spark`).
spark.conf.set("spark.sql.session.timeZone", "Australia/Sydney")

spark.sql("""
SELECT
  '1900-01-01 00:00:00' AS original_timestamp,
  to_utc_timestamp('1900-01-01 00:00:00', 'Australia/Sydney') AS to_utc_from_aest,
  from_utc_timestamp(to_utc_timestamp('1900-01-01 00:00:00', 'Australia/Sydney'), 'Australia/Sydney') AS from_utc_to_aest
UNION ALL
SELECT
  '2023-01-01 00:00:00',
  to_utc_timestamp('2023-01-01 00:00:00', 'Australia/Sydney'),
  from_utc_timestamp(to_utc_timestamp('2023-01-01 00:00:00', 'Australia/Sydney'), 'Australia/Sydney')
""").show(truncate=False)
By using '1900-01-01 00:00:00', the original_timestamp and from_utc_to_aest columns should show the same value in the result, confirming the round trip through UTC is lossless.
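If you'd rather check this programmatically than eyeball the output, one option is to cast the string literal to a timestamp and compare it with the round-tripped value. A sketch, where the column alias round_trip_ok is my own:

# Sketch of a programmatic check (alias `round_trip_ok` is illustrative)
row = spark.sql("""
SELECT
  CAST('1900-01-01 00:00:00' AS TIMESTAMP) =
    from_utc_timestamp(
      to_utc_timestamp('1900-01-01 00:00:00', 'Australia/Sydney'),
      'Australia/Sydney') AS round_trip_ok
""").first()
assert row.round_trip_ok  # True when the round trip is lossless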