
convert string dataframe column MM/dd/yyyy hh:mm:ss AM/PM to timestamp MM-dd-yyyy hh:mm:ss

Venkata_Krishna
New Contributor

How do I convert the string 6/3/2019 5:06:00 AM to a timestamp in 24-hour format (MM-dd-yyyy hh:mm:ss) in PySpark?

1 REPLY

lee
Contributor

You would use a combination of two functions:

pyspark.sql.functions.from_unixtime(timestamp, format='yyyy-MM-dd HH:mm:ss') (documentation), which formats epoch seconds into a string using the given pattern, and

pyspark.sql.functions.unix_timestamp(timestamp=None, format='yyyy-MM-dd HH:mm:ss') (documentation), which parses a string column into seconds since the Unix epoch using the given pattern.

from pyspark.sql.types import StringType, TimestampType
from pyspark.sql.functions import unix_timestamp, from_unixtime

df = spark.createDataFrame(["6/3/2019 5:06:00 AM"], StringType()).toDF("ts_string")

# parse the 12-hour AM/PM string and cast it to a timestamp column
# (on Spark 3.0+ the stricter parser may need 'M/d/yyyy h:mm:ss a' for
# single-digit months, days and hours, or spark.sql.legacy.timeParserPolicy=LEGACY)
df1 = df.select(
  from_unixtime(unix_timestamp('ts_string', 'MM/dd/yyyy hh:mm:ss a'))
    .cast(TimestampType()).alias("timestamp")
)

# change the display format: the output pattern belongs in from_unixtime,
# and HH (not hh) gives the 24-hour clock
df2 = df1.select(
  from_unixtime(unix_timestamp('timestamp'), 'MM-dd-yyyy HH:mm:ss').alias("timestamp2")
)

# all together
df3 = df.select(
  'ts_string',
  from_unixtime(unix_timestamp('ts_string', 'MM/dd/yyyy hh:mm:ss a'))
    .cast(TimestampType()).alias("timestamp"),
  from_unixtime(unix_timestamp('ts_string', 'MM/dd/yyyy hh:mm:ss a'),
                'MM-dd-yyyy HH:mm:ss').alias("timestamp2")
)
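
On Spark 2.2 and later, to_timestamp and date_format cover the same parse-and-reformat steps without the round trip through epoch seconds. A minimal sketch of that alternative, reusing the df and patterns above (df4 is just an illustrative name):

from pyspark.sql.functions import to_timestamp, date_format

df4 = df.select(
  'ts_string',
  # parse the 12-hour AM/PM string directly into a timestamp column
  to_timestamp('ts_string', 'MM/dd/yyyy hh:mm:ss a').alias("timestamp"),
  # render the parsed value back as a 24-hour formatted string
  date_format(to_timestamp('ts_string', 'MM/dd/yyyy hh:mm:ss a'),
              'MM-dd-yyyy HH:mm:ss').alias("timestamp2")
)
df4.show(truncate=False)

The resulting columns match df3 above: timestamp is a true TimestampType column and timestamp2 is its 24-hour string rendering.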
