DABs (Databricks Asset Bundles)

IONA
New Contributor III

Hi!

I am relatively new to DABs, but getting on quite well.

I have managed to deploy both a job that uses a notebook defined in the bundle itself and a job that points to a notebook living in an Azure DevOps Git repo.

While these are both viable solutions, I would like the notebook to be deployed out of the repo and into the .bundle folder, and have the job point to the code there.

I can make this happen using "source_linked_deployment" for a notebook included in the bundle, but not for one in the repo.
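For context, here is roughly how I have it working for the in-bundle notebook: a minimal sketch of the preset in databricks.yml, assuming a target named "dev" (the target name is just an example):

```yaml
# databricks.yml (fragment) - "dev" is an illustrative target name
targets:
  dev:
    presets:
      # Jobs reference the workspace source files directly
      # instead of a copy in the .bundle deploy folder
      source_linked_deployment: true
```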

I'm sure there must be a simple way of doing this, but I haven't managed to spot it.

Rgds

Gary

saurabh18cs
Honored Contributor III

Hi @IONA ,

You need to add a step to your CD pipeline to copy the notebook into the bundle before deploying:

- checkout: self
- script: |
    cp path/to/notebook_in_repo/notebook.py .bundle/notebook.py
  displayName: 'Copy notebook into bundle'
- script: |
    databricks bundle deploy
  displayName: 'Deploy Databricks Bundle'
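You can exercise the copy step locally before wiring it into the pipeline; the directory names below are stand-ins for your actual repo layout:

```shell
set -e
# Stand-in directories; replace with your real repo path and bundle root
mkdir -p path/to/notebook_in_repo .bundle
echo "# Databricks notebook source" > path/to/notebook_in_repo/notebook.py

# The copy step from the pipeline above
cp path/to/notebook_in_repo/notebook.py .bundle/notebook.py

# Confirm the notebook landed where the job expects it
ls .bundle
```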

Within the DAB YAML:

resources:
  jobs:
    my_job:
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: ./notebook.py