Hi everyone,
We are currently migrating to Unity Catalog and, along with it, moving our CD pipeline from dbx to Databricks Asset Bundles.
Having read the documentation beforehand, I was sure Databricks would support deploying jobs directly to (external) volumes, since that would make sense from a design perspective. After testing, however, this does not seem to be the case. Maybe I'm missing something...
What I've tried so far (a sketch of the corresponding databricks.yml is below the list):
- workspace.root_dir: "/Volumes/<catalog>/<db>/code/${bundle.name}" => This creates a folder named "Volumes" in the workspace fs
- workspace.root_path: "s3://<s3-bucket>/volumes/code/${bundle.name}" => Error: unable to mkdir to write file s3:/<s3-bucket>/volumes/code/analytics_backend/state/deploy.lock: Path (s3:/<s3-bucket>/volumes/code/analytics_backend/state) doesn't start with '/'
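For reference, this is roughly how the two attempts look in my databricks.yml. Catalog, schema, and bucket names are placeholders, and only one of the two workspace blocks is active at a time (the other is commented out):

```yaml
bundle:
  name: analytics_backend

# Attempt 1: point the deployment root at the UC (external) volume path.
# Result: a literal "Volumes" folder is created in the workspace file system.
workspace:
  root_dir: "/Volumes/<catalog>/<db>/code/${bundle.name}"

# Attempt 2: point the deployment root at the S3 location backing the external volume.
# Result: "unable to mkdir to write file ... Path ... doesn't start with '/'"
# workspace:
#   root_path: "s3://<s3-bucket>/volumes/code/${bundle.name}"
```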
Is there a way to deploy directly to an external volume via DAB? dbx, for example, supported deployments to abfss locations.
Thanks in advance!
David