I think it's not really possible, though I am quite new to Databricks. Here are the types that one can use: Data types | Databricks on AWS. There is a date type and a timestamp type, but it doesn't look like there is anything in between. (You could of ...
@David_Billa how about:

import pyspark.sql.functions as F

# Join the five date/time segments of the file name (split_part uses
# 1-based indexing; negative indices count from the end of the string)
# and parse the result. The 'yyyyMMddHHmm' pattern is an assumption
# about the file-name layout — adjust it to match your actual files.
df_with_datetime = df.withColumn(
    'extracted_datetime',
    F.to_timestamp(
        F.concat(
            *[F.split_part(F.col('file_name'), F.lit('_'), F.lit(i)) for i in range(-6, -1)]
        ),
        'yyyyMMddHHmm'
    )
)
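The slicing logic above can be sanity-checked in plain Python before running it on a cluster. This is a minimal sketch assuming a hypothetical file name shaped like `<prefix>_YYYY_MM_DD_HH_mm_<suffix>`; the `parts[-6:-1]` slice mirrors the `split_part` indices -6 through -2 used in the Spark version:

```python
from datetime import datetime

def extract_datetime(file_name: str) -> datetime:
    # Hypothetical layout: <prefix>_YYYY_MM_DD_HH_mm_<suffix>
    parts = file_name.split("_")
    # Same segments as split_part(-6) .. split_part(-2) in the Spark snippet
    stamp = "".join(parts[-6:-1])
    return datetime.strptime(stamp, "%Y%m%d%H%M")

print(extract_datetime("report_2024_01_15_10_30_data.csv"))
# → 2024-01-15 10:30:00
```

If your real file names have a different number of trailing segments (e.g. they include seconds), both the slice bounds and the format string need to change together.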