I am facing the same issue on DBR 14.3 and the beta of 15.4.
My cluster uses the "Unrestricted" policy and "Single user" access mode, set to a user who has permission to read and write to the volume. I verified the permissions by writing a small DataFrame directly to my desired checkpoint folder (with .write instead of .setCheckpointDir followed by .checkpoint) and did not get the error. The exception is only raised when setting the volume as Spark's checkpoint directory; a rough reproduction is below.
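Roughly what I ran in a notebook (where spark is already defined); the volume path is a placeholder, not my actual catalog/schema/volume names:

# Placeholder Unity Catalog volume path; substitute real catalog/schema/volume names.
volume_path = "/Volumes/<catalog>/<schema>/<volume>/checkpoints"

df = spark.range(10)

# Writing directly to the volume succeeds, which confirms read/write permissions.
df.write.mode("overwrite").parquet(f"{volume_path}/permission_test")

# This call is where java.io.IOException: Operation not permitted is raised.
spark.sparkContext.setCheckpointDir(volume_path)
df.checkpoint()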
Here is a bit more of the stack trace when calling .setCheckpointDir on a Unity Catalog volume:
java.io.IOException: Operation not permitted
at java.io.UnixFileSystem.canonicalize0(Native Method)
at java.io.UnixFileSystem.canonicalize(UnixFileSystem.java:177)
at java.io.File.getCanonicalPath(File.java:626)
at java.io.File.getCanonicalFile(File.java:651)
at org.apache.spark.util.SparkFileUtils.resolveURI(SparkFileUtils.scala:49)
at org.apache.spark.util.SparkFileUtils.resolveURI$(SparkFileUtils.scala:33)
at org.apache.spark.util.Utils$.resolveURI(Utils.scala:105)
...
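Judging by the trace, the failure happens in the JVM's local filesystem canonicalization of the /Volumes path (java.io.File.getCanonicalPath) rather than in Spark's own logic. As a quick sanity check, the same call can be probed directly through py4j from a notebook; this is only a diagnostic sketch using the same placeholder path as above:

# Probe the JVM call shown in the stack trace; it should raise the same
# IOException if local canonicalization of the /Volumes path is the culprit.
spark._jvm.java.io.File("/Volumes/<catalog>/<schema>/<volume>/checkpoints").getCanonicalPath()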
What storage solution is recommended for setting the cluster checkpoint directory in Databricks, if not Unity Catalog volumes?