[UNRESOLVED_ROUTINE] Cannot resolve function `date_format`
Hi all,
We are getting the following error log in a Workflow:
AnalysisException: [UNRESOLVED_ROUTINE] Cannot resolve function `date_format` on search path [`system`.`builtin`, `system`.`session`]. SQLSTATE: 42883
The Workflow consists of several notebooks, all running on the same cluster to avoid cluster idle time. When we trigger the Workflow, the first task runs successfully, but the second task always fails at the same point with the error shown above.
This is the configuration that we are using:
- Databricks RT version: 14.3 LTS (Scala 2.12, Spark 3.5.0)
- Unity Catalog enabled
We import org.apache.spark.sql.functions._ to use date_format and other Spark functions.
Has anyone encountered this issue before? Is there any known solution?
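For reference, this is roughly how we call date_format in the failing notebook (the DataFrame and column names here are illustrative, not our actual job):

```scala
import org.apache.spark.sql.functions.{col, date_format}

// Hypothetical example; the real notebook applies date_format to an event timestamp column.
val formatted = eventsDf.withColumn(
  "event_day",
  date_format(col("event_ts"), "yyyy-MM-dd")
)
```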
Labels: Workflows
Hi Jaclaglez13,
How are you doing today? As I understand it, this issue usually happens when Unity Catalog affects function resolution across tasks in a Workflow. Since your first task runs fine but the second always fails at the same point, the session is likely losing access to built-in SQL functions when switching notebooks. A few things to try:
- Use the fully qualified function name (org.apache.spark.sql.functions.date_format in Scala, or system.builtin.date_format in SQL) to ensure proper resolution.
- Restart the cluster before the second task to refresh the session.
- Check whether the function is visible by running SHOW FUNCTIONS LIKE 'date_format';. If it isn't listed, the session is not resolving built-in functions correctly.
These should help get things running smoothly. Let me know what works!
Regards,
Brahma

