Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

PySpark timestamp missing microsecond precision (last three digits of SSSSSS): HH:mm:ss.SSSSSS

jimbo
New Contributor II

Hi all,

We are having issues with the timestamp data type in Spark when ingesting files.

The source data carries six digits of sub-second precision, but the most we can extract after ingestion is three. For example, we get 12:03:23.123, but what is required is 12:03:23.123456. The source file has the full precision, but it is lost when the file is ingested. Here is an example:

from pyspark.sql.functions import to_timestamp

df.select(
    to_timestamp("date_col", "yyyy-MM-dd").alias("date"),
    to_timestamp("timestamp_col", "yyyy-MM-dd HH:mm:ss.SSS").alias("timestamp")
).show(truncate=False)


|2022-03-16 12:34:56.789|
|2022-03-16 01:23:45.678|

The requirement is |2022-03-16 12:34:56.456789|.

What is the best way to do this?

Many thanks

Jay

1 REPLY

Kaniz_Fatma
Community Manager

Hi @jimbo, Handling high-precision timestamps in Spark can be tricky, especially when you need to preserve microsecond-level precision. 


Letโ€™s explore some strategies to achieve your desired timestamp format.


Custom Format String:

  • You’re currently using the format string "yyyy-MM-dd HH:mm:ss.SSS" to parse the timestamp, and each S consumes one fractional digit, so only milliseconds survive. To retain microsecond precision, extend the format string to six S digits: "yyyy-MM-dd HH:mm:ss.SSSSSS".

Using Unix Timestamps:

  • If youโ€™re dealing with high-precision timestamps, consider using Unix timestamps (seconds since the epoch) instead of formatted strings.

Proleptic Gregorian Calendar:

  • Starting from Spark 3.0, the Proleptic Gregorian calendar is used for internal operations on timestamps. This ensures consistent behaviour across different dates and years.
  • Be aware that some dates that existed in Spark 2.4 may not exist in Spark 3.0 due to differences in calendars. For example, 1000-02-29 is not a valid date in the Gregorian calendar.
  • Ensure that your Spark version is 3.0 or later to benefit from these improvements.

Remember to adjust your approach based on your specific use case and requirements. If you need to handle timestamps with microsecond precision, the custom format string or Unix timestamps should serve you well!
