Is there a way to use compute policies to force Delta Live Tables to use specific Databricks Runtime and PySpark versions? While trying to use some functions introduced in PySpark 3.5.0, I can't get Delta Live Tables to run on Databricks Runtime 14.0/14.1. The cluster policy simply constrains the version, for example:
{
  "spark_version": {
    "type": "regex",
    "pattern": "^14\\..*"
  }
}
Testing confirms that the policy correctly forces the version when creating normal (all-purpose) compute.
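For context, this is roughly how that can be verified via the Clusters REST API: creating a cluster under the policy with a deliberately out-of-range runtime should be rejected by policy validation. A minimal sketch, assuming placeholder values for the workspace host, token, policy ID, and node type:

import requests

host = "https://<workspace-host>"  # placeholder
token = "<personal-access-token>"  # placeholder

resp = requests.post(
    f"{host}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "cluster_name": "policy-test",
        "policy_id": "<policy-id>",           # the policy with the regex above
        "spark_version": "12.2.x-scala2.12",  # violates ^14\..* and should fail
        "node_type_id": "<node-type>",        # placeholder
        "num_workers": 1,
    },
)
print(resp.status_code, resp.json())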
However, a Delta Live Tables pipeline using this policy fails because it can't find functions introduced in PySpark 3.5.0, and checking the version of the pipeline's compute suggests it's running Databricks Runtime 12.2 (PySpark 3.3).
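This is roughly how I checked from inside the pipeline, by materializing the PySpark version into a table (the table name runtime_version is just for illustration):

import dlt
import pyspark
from pyspark.sql import functions as F

# Diagnostic table: surfaces the PySpark version the pipeline's
# compute is actually running. `spark` is the session that DLT
# injects into pipeline source files.
@dlt.table(comment="PySpark version of the pipeline compute")
def runtime_version():
    return spark.range(1).select(
        F.lit(pyspark.__version__).alias("pyspark_version")
    )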
Does forcing versions using compute policies simply not work for Delta Live Tables, and if not, is there another way to influence which Databricks Runtime is used? Another use case we will likely have is using ML runtimes to get GraphFrames pre-loaded, so I'd like to confirm that's possible. A sketch of the current pipeline settings is included below for reference.
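For completeness, the pipeline settings look roughly like this; the only runtime-related field I've found there is channel, which only selects CURRENT or PREVIEW rather than pinning a specific runtime version (the name and cluster values here are placeholders):

{
  "name": "my-pipeline",
  "channel": "CURRENT",
  "clusters": [
    {
      "label": "default",
      "policy_id": "<policy-id>",
      "num_workers": 1
    }
  ]
}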