04-12-2023 12:16 AM
Hi all,
I'm trying to derive a date from the columns year and week, but the week-based pattern is not recognized:
df_loaded = df_loaded.withColumn("week_year", F.concat(F.lit("3"),F.col('Week'), F.col('Jaar')))
df_loaded = df_loaded.withColumn("date", F.to_date(F.col("week_year"), "uwwyyyy"))
This doesn't work, and I get the following error:
SparkUpgradeException: You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'uwwyyyy' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
Any ideas how to get from columns week & year to date in pyspark?
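One possible workaround (a sketch, not a definitive answer): the Spark 3 parser no longer accepts week-based pattern letters for parsing, so instead of formatting a string and calling `to_date`, the conversion can be done directly with Python's `datetime.date.fromisocalendar`. The helper below is hypothetical; the weekday `3` mirrors the literal `"3"` concatenated in the snippet above (ISO weekday 3 = Wednesday):

```python
from datetime import date

def iso_week_to_date(year: int, week: int, weekday: int = 3) -> date:
    """Return the calendar date for an ISO year/week/weekday.

    weekday is 1 (Monday) .. 7 (Sunday); the default 3 (Wednesday)
    matches the literal "3" prepended in the original snippet.
    """
    return date.fromisocalendar(year, week, weekday)

# Example: ISO week 15 of 2023, Wednesday
print(iso_week_to_date(2023, 15))  # 2023-04-12
```

In PySpark this could then be wrapped in a UDF, e.g. `F.udf(iso_week_to_date, DateType())`, applied to the `Jaar` and `Week` columns (assuming both are castable to int). Alternatively, setting `spark.sql.legacy.timeParserPolicy` to `LEGACY`, as the error message suggests, restores the pre-3.0 parsing behavior.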
- Labels: Year