This issue can occur when a table is created with the Hive syntax instead of the Spark syntax.
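For illustration, the two forms of `CREATE TABLE` differ in how the storage format is declared; the table and column names below are hypothetical:

```sql
-- Hive syntax: declares the format with STORED AS (can trigger the issue)
CREATE TABLE events_hive (id INT, ts STRING)
STORED AS PARQUET;

-- Spark syntax: declares the data source with the USING clause
CREATE TABLE events_spark (id INT, ts STRING)
USING PARQUET;
```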
Read more here: https://docs.databricks.com/spark/latest/spark-sql/language-manual/sql-ref-syntax-ddl-create-table-h...
The issue mentioned in the error message is already fixed in the Hive metastore layer. However, the default Hive metastore client version used by Databricks is older and does not include the fix. Upgrade the metastore client to version 1.2.1 or greater.
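As a sketch, the metastore client version can be pinned through the cluster's Spark configuration using the standard `spark.sql.hive.metastore.version` and `spark.sql.hive.metastore.jars` properties; the exact version shown here is only an example and should match your metastore deployment:

```
spark.sql.hive.metastore.version 1.2.1
spark.sql.hive.metastore.jars maven
```

Setting `spark.sql.hive.metastore.jars` to `maven` makes Spark download metastore client jars matching the requested version; alternatively, a path to pre-staged jars can be supplied.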