DAB Deployment to an (external) volume on UC

breaka
New Contributor III

Hi everyone,

We are currently migrating to Unity Catalog, and also moving our CD pipeline from DBX to Databricks Asset Bundles.

Having read the documentation beforehand, I was sure Databricks would support deploying jobs directly to (external) volumes, as that would make sense from a design perspective. But after testing, I found that this does not seem to be the case. Maybe I'm missing something...

What I've tried so far (a consolidated config sketch follows this list):

- workspace.root_dir: "/Volumes/<catalog>/<db>/code/${bundle.name}" => This creates a folder named "Volumes" in the workspace fs

- workspace.root_path: "s3://<s3-bucket>/volumes/code/${bundle.name}" => Error: unable to mkdir to write file s3:/<s3-bucket>/volumes/code/analytics_backend/state/deploy.lock: Path (s3:/<s3-bucket>/volumes/code/analytics_backend/state) doesn't start with '/'
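
For reference, here is roughly how these attempts sit in our databricks.yml. This is a sketch, not our full config: the target name is a placeholder, catalog/schema/bucket names are redacted as above, and only one of the two root settings was active per test.

bundle:
  name: analytics_backend

targets:
  prod:
    workspace:
      # Attempt 1: UC volume path => ends up as a literal "Volumes" folder in the workspace fs
      root_dir: "/Volumes/<catalog>/<db>/code/${bundle.name}"
      # Attempt 2: the volume's underlying S3 location => mkdir/deploy.lock error
      # root_path: "s3://<s3-bucket>/volumes/code/${bundle.name}"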

Is there a way to deploy directly to an external volume via DAB? DBX, for example, supported deployments to abfss locations.

Thanks in advance!
David


prith
New Contributor III

The path in my databricks.yml is written with two slashes.

This is what I used:

root_path: s3://test-${var.sdlc_env}-us-east-1/dabs-artifacts/projects/.bundle/${bundle.name}/${bundle.target}

Yet I still got this error:

Error: Path (s3:/test-dev-us-east-1/dabs-artifacts/projects/.bundle/databricks-notebook-template-dabs/dev/files) doesn't start with '/'
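
If the goal is mainly to get the deployed artifacts onto a volume, one thing that may be worth checking: recent Databricks CLI releases document Unity Catalog volume paths for workspace.artifact_path, while root_path stays on the workspace file system. A minimal sketch, assuming a CLI version with that support and a pre-existing volume (catalog/schema/volume names are placeholders):

targets:
  dev:
    workspace:
      # Bundle files and deployment state stay on the workspace file system
      root_path: ~/.bundle/${bundle.name}/${bundle.target}
      # Built artifacts (e.g. wheels) are uploaded to the UC volume instead
      artifact_path: /Volumes/<catalog>/<schema>/<volume>/${bundle.name}/artifacts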

 
