Starting with DBR 17, which runs Spark 4.0, spark.sql.ansi.enabled defaults to true. With the flag enabled, strings are still implicitly converted to numbers, but in a dangerous manner. Consider:
SELECT 123='123';
SELECT 123='123X';
The first statement succeeds, but the second fails with a hard error when Spark tries to convert '123X' to an INT. This is a serious hazard in setups where data comes from multiple sources and numbers may unintentionally end up in string-typed columns: if a non-numeric value is introduced into the data at some point, pipelines that previously worked suddenly crash. When spark.sql.ansi.enabled is true, implicit type conversions that can fail at runtime, such as implicit casts from STRING to INT, should be disallowed entirely.
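Until such implicit casts are disallowed, one defensive option is to make the conversion explicit with Spark's TRY_CAST, which is documented to return NULL for malformed input instead of raising an error, even under ANSI mode (a sketch based on TRY_CAST's documented semantics, not verified on this exact DBR build):

```sql
-- TRY_CAST never raises: malformed input yields NULL instead of an error
SELECT 123 = TRY_CAST('123'  AS INT);  -- true
SELECT 123 = TRY_CAST('123X' AS INT);  -- NULL (comparison with NULL), not a crash
```

This keeps the pipeline running when bad values appear, at the cost of having to handle the resulting NULLs explicitly.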
Also, on DBR 17.3 (Spark 4.0.0) the following fails, but on Spark 4.0.1 it returns NULL:
SELECT '123X'::int;
The Spark 4.0.1 behaviour is clearly preferable, as it makes the '::' cast operator a convenient way to sanitize column types.
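Since the '::' behaviour differs between 4.0.0 and 4.0.1, a version-independent way to get the sanitizing semantics is to spell the cast as TRY_CAST, which yields NULL for malformed input on both versions (the column and table names below are hypothetical, for illustration only):

```sql
-- Version-independent sanitization: NULL for malformed input on both
-- Spark 4.0.0 and 4.0.1, regardless of the ANSI setting
SELECT TRY_CAST(raw_value AS INT) AS value_int  -- raw_value: hypothetical column
FROM staging_table;                             -- staging_table: hypothetical table
```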