Databricks and Unity Catalog do not enforce a specific data model such as Data Vault. The default is a Lakehouse architecture built on Delta Lake, which provides ACID transactions, schema enforcement on write, and controlled schema evolution.
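For example, here is a minimal PySpark sketch of Delta Lake schema evolution, assuming a Databricks notebook (where `spark` is predefined); the table and column names are placeholders:

```python
from pyspark.sql import SparkSession

# On Databricks, `spark` already exists; getOrCreate() is a no-op there.
spark = SparkSession.builder.getOrCreate()

# Initial write: the table is created with two columns.
spark.createDataFrame([(1, "alice")], ["id", "name"]) \
    .write.format("delta").mode("overwrite") \
    .saveAsTable("main.demo.users")

# A later append adds a column. Without mergeSchema, Delta rejects the
# mismatched schema; with it, the table schema evolves in one ACID commit.
spark.createDataFrame([(2, "bob", "bob@example.com")], ["id", "name", "email"]) \
    .write.format("delta").mode("append") \
    .option("mergeSchema", "true") \
    .saveAsTable("main.demo.users")
```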
Unity Catalog organizes data hierarchically into metastores, catalogs, schemas, and tables (a three-level catalog.schema.table namespace within each metastore), which leaves you free to implement whichever model fits the use case: Data Vault, star schema, or 3NF.
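A sketch of creating that hierarchy, assuming a Unity Catalog-enabled workspace with permission to create catalogs; `sales`, `dw`, and the table definition are placeholder names:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # predefined on Databricks

# Three-level namespace: catalog -> schema -> table.
spark.sql("CREATE CATALOG IF NOT EXISTS sales")
spark.sql("CREATE SCHEMA IF NOT EXISTS sales.dw")
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales.dw.orders (
        order_id    BIGINT,
        customer_id BIGINT,
        order_ts    TIMESTAMP
    ) USING DELTA
""")
```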
For large-scale systems with strong audit requirements, Data Vault is a good fit: its insert-only hubs, links, and satellites map naturally onto Delta Lake features such as MERGE for incremental, idempotent loads and schema enforcement.
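As an illustration, here is a sketch of an idempotent Data Vault hub load using Delta Lake's MERGE; the `sales.dv.hub_customer` table and the `staging_customers` source are hypothetical:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # predefined on Databricks

# A Data Vault hub holds one row per business key, insert-only.
spark.sql("""
    CREATE TABLE IF NOT EXISTS sales.dv.hub_customer (
        customer_hk   STRING NOT NULL,  -- hash key derived from the business key
        customer_bk   STRING NOT NULL,  -- natural/business key
        load_ts       TIMESTAMP,
        record_source STRING
    ) USING DELTA
""")

# Incremental load: insert only business keys not yet in the hub, so
# reruns over the same staging data are idempotent.
spark.sql("""
    MERGE INTO sales.dv.hub_customer AS hub
    USING (
        SELECT DISTINCT
            sha2(CAST(customer_id AS STRING), 256) AS customer_hk,
            CAST(customer_id AS STRING)            AS customer_bk,
            current_timestamp()                    AS load_ts,
            'crm'                                  AS record_source
        FROM staging_customers
    ) AS stg
    ON hub.customer_hk = stg.customer_hk
    WHEN NOT MATCHED THEN INSERT *
""")
```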
Refer to the Databricks Lakehouse Documentation and Unity Catalog Overview for more details.