Databricks workflow deployment issue
11-19-2024 10:24 PM
Below is the Databricks workflow. (Note: ETL_schema_checks is an If/else task in the workflow.)
The task values below are set in the ETL_data_checks notebook based on some conditions. The next task, ETL_schema_checks (If/else), is executed based on this output.
# In the ETL_data_checks notebook: record the check outcome as a task value
if failure:  # boolean result of the data checks
    dbutils.jobs.taskValues.set(key="flag_mismatch", value="failure")
else:
    dbutils.jobs.taskValues.set(key="flag_mismatch", value="success")
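The If/else task can then compare this task value in its condition using a dynamic value reference. A minimal sketch of what the condition settings for ETL_schema_checks might look like, assuming the task and key names above (the exact layout depends on your job definition):

```yaml
- task_key: ETL_schema_checks
  depends_on:
    - task_key: ETL_data_checks
  condition_task:
    op: EQUAL_TO
    left: "{{tasks.ETL_data_checks.values.flag_mismatch}}"
    right: "success"
```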
The If/else condition in ETL_schema_checks uses the task value set above.
I am trying to deploy the workflow using Databricks Asset Bundles. Below is the job definition. While deploying I am getting the error: A managed resource "tasks" "ETL_data_checks" has not been declared in the root module.
12-18-2024 07:07 PM
Hi @suryateja405555, how are you doing today?
As I understand it, ETL_data_checks must be properly declared in the tasks section of your workflow configuration in the root module of the bundle. A few things to check:
- Declare ETL_data_checks with a task_key and its respective properties, such as notebook_path and a cluster key.
- Verify that ETL_schema_checks references the correct task_key for ETL_data_checks, so the dependency aligns with the workflow logic.
- Double-check naming consistency between the task definition and every reference to it; a mismatch produces exactly this kind of "not been declared in the root module" error.
- After fixing, validate the asset bundle with the Databricks CLI to ensure there are no errors in the configuration.
- Confirm that all tasks have their cluster resources appropriately associated.
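As a hedged sketch of the points above, the tasks section of the bundle job definition might look like this. The notebook path and cluster key are placeholders, not taken from the original post; only the task names come from the question:

```yaml
resources:
  jobs:
    etl_workflow:            # placeholder job name
      name: etl_workflow
      tasks:
        - task_key: ETL_data_checks
          notebook_task:
            notebook_path: ./notebooks/etl_data_checks.py   # placeholder path
          job_cluster_key: etl_cluster                      # placeholder cluster key
        - task_key: ETL_schema_checks
          depends_on:
            - task_key: ETL_data_checks                     # must match the task_key above exactly
          condition_task:
            op: EQUAL_TO
            left: "{{tasks.ETL_data_checks.values.flag_mismatch}}"
            right: "success"
```

Running `databricks bundle validate` before `databricks bundle deploy` should surface any remaining reference mismatches.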
Please give it a try and let me know if it helps. Good day.
Regards,
Brahma

