Hi all 🙂
I used the new Lakeflow UI to create a pipeline, and now I am struggling with the asset bundle configuration.
When I create the pipeline manually in the UI, I can simply point it at the folder where my SQL and Python transformation files are stored. But I don't know how to configure this via asset bundles.
Right now I have this in my resources/pipeline.yml:
resources:
  pipelines:
    cdc_cdr_demo_pipeline:
      name: cdc_cdr_demo_pipeline
      libraries:
        - file:
            path: /Workspace/Users/[my-user-name]/.bundle/demo_cdc_cdr_pipeline/dev/files/transformations/
      serverless: true
      photon: true
      catalog: ${var.catalog}
      schema: ${var.schema}
      development: true
      channel: "PREVIEW"
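The path above is hard-coded to the dev target, which already feels wrong for prod. I assume the ${workspace.file_path} substitution is meant to avoid exactly that, although I am not sure whether file.path accepts a folder at all, so this is just an untested sketch:

      libraries:
        - file:
            # assumption: ${workspace.file_path} resolves to <root_path>/files for the active target
            path: ${workspace.file_path}/transformations/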
and this in my databricks.yml:
bundle:
  name: demo_cdc_cdr_pipeline_databricks_bundle

include:
  - resources/*.yml

variables:
  catalog:
    description: The catalog to use
  schema:
    description: The schema to use
  notifications:
    description: The email addresses to use for failure notifications

targets:
  dev:
    # The default target uses 'mode: development' to create a development copy.
    # - Deployed resources get prefixed with '[dev my_user_name]'
    # - Any job schedules and triggers are paused by default.
    # See also https://docs.databricks.com/dev-tools/bundles/deployment-modes.html.
    mode: development
    default: true
    workspace:
      host: host-url
      root_path: /Workspace/Users/${workspace.current_user.userName}/.bundle/demo_cdc_cdr_pipeline/dev
    variables:
      # env: "dev_${workspace.current_user.short_name}_"
      catalog: sandbox
      schema: user_name
      notifications: []
  prod:
    mode: production
    workspace:
      host: host-url
    variables:
      catalog: sandbox_dui_prod
      schema: user_name
      notifications: []
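I also wondered whether the glob-style includes that the new Lakeflow editor seems to generate are what is expected here, along these lines (purely a guess on my side, syntax unverified):

resources:
  pipelines:
    cdc_cdr_demo_pipeline:
      libraries:
        # assumption: a glob include relative to the bundle root, as the new editor appears to use
        - glob:
            include: transformations/**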
What is the correct asset bundle configuration to register all files in the transformations folder as source code for the new Lakeflow pipeline?
Thanks a lot in advance and best regards,
Susanne