Data Engineering

DABs (Databricks Asset Bundles)

IONA
New Contributor III

Hi!

I am relatively new to DABs, but getting on quite well.

I have managed to deploy both a job that uses a notebook defined in the bundle itself and a job that points to a notebook living in an Azure DevOps Git repo.
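
For context, the repo-based job is wired up roughly like this (the org, project, repo names, and notebook path below are just placeholders for my setup):

resources:
  jobs:
    repo_notebook_job:
      name: repo-notebook-job
      # Fetch the notebook straight from the remote repo at run time
      git_source:
        git_url: https://dev.azure.com/my-org/my-project/_git/my-repo
        git_provider: azureDevOpsServices
        git_branch: main
      tasks:
        - task_key: run_notebook
          notebook_task:
            # Resolved against the repo root because source is GIT
            notebook_path: notebooks/my_notebook
            source: GIT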

While these are two viable solutions, I would like the notebook to be deployed out of the repo and into the .bundle folder, and have the job point to the code there.

I can make this happen using "source_linked_deployment" for a notebook included in the bundle, but not for one in the repo.
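
The working bundled-notebook case looks roughly like this (bundle, job, and path names are just examples from my config):

bundle:
  name: my_bundle

presets:
  # Deployed jobs reference the synced workspace source files directly
  source_linked_deployment: true

resources:
  jobs:
    bundled_notebook_job:
      name: bundled-notebook-job
      tasks:
        - task_key: run_notebook
          notebook_task:
            # Relative to the bundle root; synced on deploy
            notebook_path: ./src/my_notebook.py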

I'm sure there must be a simple way of doing this, but I haven't managed to spot it.

Rgds

Gary


saurabh18cs
Honored Contributor II

Hi @IONA ,

You need to add a step to your CD pipeline to copy the notebook:

- checkout: self

- script: |
    cp path/to/notebook_in_repo/notebook.py .bundle/notebook.py
  displayName: 'Copy notebook into bundle'

- script: |
    databricks bundle deploy
  displayName: 'Deploy Databricks Bundle'

Within the DAB YAML:

resources:
  jobs:
    my_job:
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: ./notebook.py
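
Then deploy as usual. If you define multiple targets in your databricks.yml, pass the target explicitly (assuming a target named dev exists in your config):

databricks bundle deploy -t dev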