Hi @PriyanshuKumar,
Thank you for your question in the Databricks community.
The try_to_timestamp() function is meant to convert a string to a timestamp using a given pattern and to return null when parsing fails. In your scenario, however, it throws an error instead of returning null, because the input string cannot be handled by the stricter datetime parser introduced in newer versions of Apache Spark; most likely the fractional seconds ('.000') in your input are not covered by the pattern 'yyyy-MM-dd HH:mm:ss'.
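To see the intended "null on failure" contract outside Spark, here is the same idea sketched in plain Python with datetime.strptime (a stand-in for illustration only, not Spark's parser). Note how a trailing fractional part that the format string does not mention makes parsing fail, which mirrors what happens with your input:

```python
from datetime import datetime

def parse_or_none(s, fmt):
    # Return None instead of raising, mimicking try_to_timestamp()'s contract
    try:
        return datetime.strptime(s, fmt)
    except ValueError:
        return None

# The trailing '.000' is not covered by the format, so parsing fails -> None
parse_or_none('2019-02-28 23:59:59.000', '%Y-%m-%d %H:%M:%S')     # returns None
# With the fractional-seconds field included, parsing succeeds
parse_or_none('2019-02-28 23:59:59.000', '%Y-%m-%d %H:%M:%S.%f')  # returns a datetime
```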
One way to get the behavior you expect is to wrap to_timestamp() in a Python try/except block and force the expression to be evaluated eagerly:
from pyspark.sql import SparkSession
from pyspark.sql.functions import to_timestamp, lit

spark = SparkSession.builder.getOrCreate()

def try_to_timestamp(timestamp_string, pattern):
    # collect() forces evaluation, so a parse error is raised inside the try block
    try:
        col = to_timestamp(lit(timestamp_string), pattern).alias("ts")
        return spark.range(1).select(col).collect()[0]["ts"]
    except Exception:
        return None

try_to_timestamp('2019-02-28 23:59:59.000', 'yyyy-MM-dd HH:mm:ss')  # None if parsing fails

This code defines a replacement try_to_timestamp() that wraps to_timestamp() in a try/except block. Because Spark evaluates expressions lazily, the collect() call is what actually runs the parse inside the try; without it, the error would only surface later, outside the except clause. A parse failure is therefore caught and None is returned instead of an error, which is the behavior try_to_timestamp() was supposed to exhibit.