You can try adding the Maven dependency to your cluster manually. For example, for Spark 3.5.x it would be:
io.graphframes:graphframes-spark3_2.12:0.10.0
and also add the PyPI dependency graphframes-py. Adding the Maven coordinates should download and install all the JVM dependencies.
But most likely this won't work on DBR ML runtimes: the ML runtime already bundles its own GraphFrames JAR, so you would end up with two differently named graphframes JARs on the classpath sharing the same namespace, and it's hard to predict which one gets resolved at runtime. I think the best option is simply to use the generic runtime instead of DBR ML.
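For reference, if you attach libraries through the Databricks Libraries API or a cluster JSON spec, the two dependencies above would look roughly like this (a sketch based on the standard library spec format; verify the exact shape against your workspace's API docs):

```json
[
  {
    "maven": {
      "coordinates": "io.graphframes:graphframes-spark3_2.12:0.10.0"
    }
  },
  {
    "pypi": {
      "package": "graphframes-py"
    }
  }
]
```

The same two entries can also be added interactively in the cluster UI under Libraries, one as a Maven coordinate and one as a PyPI package.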