I have this datetime string in my dataset: '2023061218154258', and I want to convert it to a timestamp using the code below. However, the format I expect to work doesn't, namely: yyyyMMddHHmmssSS. This code reproduces the issue:
from pyspark.sql.functions import *
spark.conf.set("spark.sql.legacy.timeParserPolicy","CORRECTED")
# If the config is set to CORRECTED then the conversion will return null instead of throwing an exception.
df = spark.createDataFrame(
    data=[("1", "2023061218154258"),
          ("2", "20230612181542.58")],
    schema=["id", "input_timestamp"])
df.printSchema()
# Timestamp string to TimestampType
df.withColumn("timestamp", to_timestamp("input_timestamp", format='yyyyMMddHHmmssSS')).show(truncate=False)
df.withColumn("timestamp", to_timestamp("input_timestamp", format='yyyyMMddHHmmss.SS')).show(truncate=False)
output:
+---+-----------------+---------+
|id |input_timestamp |timestamp|
+---+-----------------+---------+
|1 |2023061218154258 |null |
|2 |20230612181542.58|null |
+---+-----------------+---------+
+---+-----------------+----------------------+
|id |input_timestamp |timestamp |
+---+-----------------+----------------------+
|1 |2023061218154258 |null |
|2 |20230612181542.58|2023-06-12 18:15:42.58|
+---+-----------------+----------------------+
I tried to_timestamp with the format yyyyMMddHHmmssSS and expected it to convert the string 2023061218154258 into the timestamp 2023-06-12 18:15:42.58.
When I change the line
spark.conf.set("spark.sql.legacy.timeParserPolicy","CORRECTED")
to
spark.conf.set("spark.sql.legacy.timeParserPolicy","LEGACY")
the issue is solved, but I don't want to use legacy mode (because it causes other issues).
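One workaround I am considering, which stays in CORRECTED mode, is to insert the decimal separator into the string first and then parse with the yyyyMMddHHmmss.SS format that already works (e.g. via regexp_replace in Spark before calling to_timestamp). The preprocessing idea can be sketched in plain Python; insert_fraction_separator is a hypothetical helper name, not part of any library:

```python
import re
from datetime import datetime

def insert_fraction_separator(ts: str) -> str:
    # Insert a '.' before the trailing two fractional-second digits,
    # e.g. '2023061218154258' -> '20230612181542.58'.
    # The Spark analogue would be something like:
    #   regexp_replace("input_timestamp", r'^(\d{14})(\d{2})$', '$1.$2')
    return re.sub(r'^(\d{14})(\d{2})$', r'\1.\2', ts)

s = insert_fraction_separator("2023061218154258")
# With the separator in place, the string parses; %f treats '58'
# as the fraction .58 s (580000 microseconds).
parsed = datetime.strptime(s, "%Y%m%d%H%M%S.%f")
```

This assumes the input always has exactly two fractional digits; strings that already contain a '.' are left untouched by the regex.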