Data Engineering

try_to_timestamp not working as expected

PriyanshuKumar
New Contributor

try_to_timestamp should return NULL for the following expression, but it is throwing an error instead:

try_to_timestamp(
  '2019-02-28 23:59:59.000',
  'yyyy-MM-dd HH:mm:ss'
)

I understand the above expression is wrong, since the format pattern does not conform to the datetime literal, but I expect a NULL instead of an INCONSISTENT_BEHAVIOR_CROSS_VERSION.PARSE_DATETIME_BY_NEW_PARSER error.
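For what it's worth, the same literal does parse once the pattern accounts for the fractional seconds, so the only mismatch is the trailing '.000'. A quick sanity check from a notebook (assuming the usual spark session object):

# Sanity check: with fractional seconds in the pattern, the same literal
# parses without raising the cross-version error.
spark.sql("""
    SELECT try_to_timestamp(
        '2019-02-28 23:59:59.000',
        'yyyy-MM-dd HH:mm:ss.SSS'
    ) AS ts
""").show()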

1 REPLY

Kumaran
Databricks Employee

Hi @PriyanshuKumar,

Thank you for your question in the Databricks community.

The try_to_timestamp() function attempts to convert a string to a timestamp using the specified pattern and is meant to return NULL when the conversion fails. In your scenario, however, it throws an error instead, because the input string cannot be parsed by the new datetime parser introduced in newer versions of Apache Spark: the pattern 'yyyy-MM-dd HH:mm:ss' does not account for the trailing fractional seconds '.000', and that mismatch surfaces as the cross-version parsing error rather than a NULL.
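As a side note, this error class points at the cross-version parsing behavior, so one configuration-level option is to relax the session's datetime parser policy. This is only a sketch; weigh the policy change against your other workloads before applying it:

# Session-level workaround sketch (verify against your runtime and workloads):
# "CORRECTED" keeps the new parser and treats strings it cannot parse as invalid
# instead of raising the cross-version error, while "LEGACY" restores the
# pre-Spark-3.0 parser behavior.
spark.conf.set("spark.sql.legacy.timeParserPolicy", "CORRECTED")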

If you would rather leave the parser settings alone, another way to achieve the expected result is to wrap the conversion in your own try/except block instead of relying on try_to_timestamp():

from pyspark.sql import SparkSession
from pyspark.sql.functions import to_timestamp, lit

spark = SparkSession.builder.getOrCreate()

def try_to_timestamp(timestamp_string, pattern):
    # Evaluate to_timestamp() eagerly on a one-row DataFrame so any parse
    # error is raised here, where the try/except can turn it into None.
    try:
        row = spark.range(1).select(
            to_timestamp(lit(timestamp_string), pattern).alias("ts")
        ).first()
        return row["ts"]
    except Exception:
        return None

try_to_timestamp('2019-02-28 23:59:59.000', 'yyyy-MM-dd HH:mm:ss')  # returns None

This code defines a wrapper try_to_timestamp() that evaluates to_timestamp() inside a try/except block and returns None if the conversion fails, simulating the behavior you expected from the built-in try_to_timestamp(). The eager evaluation on a one-row DataFrame matters: Spark only raises the parse error when the expression actually runs, so a try/except around merely building the Column would never catch it.
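If you need the same behavior per row on a DataFrame column rather than for a single literal, the same try/except idea can live inside a Python UDF. The sketch below uses a hypothetical helper name (try_parse_ts) and Python's strptime codes rather than Spark's pattern letters:

from datetime import datetime

from pyspark.sql.functions import udf
from pyspark.sql.types import TimestampType

# Hypothetical per-row variant of the same try/except approach. Note the
# pattern uses Python strptime codes (%Y-%m-%d %H:%M:%S), not Spark's
# 'yyyy-MM-dd HH:mm:ss'.
@udf(TimestampType())
def try_parse_ts(s):
    try:
        return datetime.strptime(s, "%Y-%m-%d %H:%M:%S")
    except (TypeError, ValueError):
        return None

# Usage sketch: df.withColumn("ts", try_parse_ts("raw_ts"))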
