Hi,
I have a job consisting of three tasks:
tasks:
  - task_key: Kinesis_to_S3_new
    spark_python_task:
      python_file: ../src/kinesis.py
      parameters: ["${var.stream_region}", "${var.s3_base_path}"]
    job_cluster_key: general_cluster

  # Run delta live view
  - task_key: delta_live_view_file
    pipeline_task:
      pipeline_id: ${resources.pipelines.dlt_file_pipeline.id}
    depends_on:
      - task_key: Kinesis_to_S3_new

  # Run dbt
  - task_key: dbt
    depends_on:
      - task_key: delta_live_view_file
    dbt_task:
      project_directory: ./dbt
      commands:
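(The commands list is cut off above; it follows the usual dbt_task shape. The commands below are placeholders rather than my exact ones:)

    dbt_task:
      project_directory: ./dbt
      commands:
        - "dbt deps"   # placeholder
        - "dbt run"    # placeholder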
Without the last task (the dbt task), the bundle validates, deploys, and runs fine. After adding the dbt task, deploying the bundle fails with:
Error: cannot create job: Invalid python file reference: ../src/kinesis.py
Could the project_directory element be causing this? Did I use it incorrectly, and if so, what is the correct way to set it?
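In case the folder structure matters, the bundle is laid out roughly like this (names other than the ones referenced in the config are placeholders):

    my_bundle/
    ├── databricks.yml
    ├── resources/
    │   └── job.yml        # contains the tasks above
    ├── src/
    │   └── kinesis.py     # referenced as ../src/kinesis.py
    └── dbt/               # referenced as ./dbt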
Best,
Mathias