Databricks can install Maven libraries by coordinate and lets you point at a custom repository URL.
However, passing credentials for an authenticated private Maven repository directly through the Libraries UI or the Jobs API is not natively supported today and requires workarounds; this has been tracked internally as a product ask rather than a GA feature.
As a workaround for a private Maven host that requires authentication, you can use Apache Ivy settings supplied via a cluster init script to provide the credentials and repository resolution, letting Ivy resolve the packages at cluster startup.
For this, create an ivysettings.xml file containing the credentials and point Spark at it; on newer runtimes, you can swap in a patched Ivy JAR to externalise the settings file and support multiple authenticated repositories.
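A minimal sketch of such an init script is below. The repository host (`repo.example.com`), realm, output path, and the `MAVEN_USER`/`MAVEN_PASSWORD` environment variables are all assumptions; in practice you would populate the credentials from Databricks secrets via the cluster's environment variables rather than hard-coding them.

```shell
#!/bin/bash
# Cluster-scoped init script (sketch): writes an ivysettings.xml with
# credentials for a private Maven repository so Ivy can resolve
# spark.jars.packages coordinates against it at cluster startup.
# Host, realm, path, and credential env vars below are assumptions.

IVY_DIR="${IVY_SETTINGS_DIR:-/tmp/ivy}"   # hypothetical location
mkdir -p "$IVY_DIR"

cat > "$IVY_DIR/ivysettings.xml" <<EOF
<ivysettings>
  <!-- Credentials for the private host; MAVEN_USER/MAVEN_PASSWORD are
       expected to come from secret-backed cluster env vars. -->
  <credentials host="repo.example.com"
               realm="Example Realm"
               username="${MAVEN_USER}"
               passwd="${MAVEN_PASSWORD}"/>
  <settings defaultResolver="all"/>
  <resolvers>
    <chain name="all">
      <!-- Private authenticated repository (hypothetical URL) -->
      <ibiblio name="private" m2compatible="true"
               root="https://repo.example.com/maven2/"/>
      <!-- Fall back to Maven Central for public artifacts -->
      <ibiblio name="central" m2compatible="true"/>
    </chain>
  </resolvers>
</ivysettings>
EOF
```

With the file in place, point Spark at it by setting `spark.jars.ivySettings` (a standard Spark property) to the file's path in the cluster's Spark configuration, e.g. `spark.jars.ivySettings /tmp/ivy/ivysettings.xml`.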