02-10-2025 11:03 PM
Hi there,
I have a use case where I want to make the DLT pipeline ID available as a configuration parameter of that same DLT pipeline.
In a notebook task we can use workspace IDs or the task ID, e.g. task_id = {{task.id}} / {{task.name}}, save them as parameters, and read them later with dbutils.widgets.get("task_id").
Can we do something similar in a DLT pipeline, e.g. set dlt_pipeline_id = {{dlt_pipeline.name}} in the configuration and then read it with spark.conf.get("dlt_pipeline_id")?
Or is there any other way to achieve this?
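For clarity, this is the pattern I am hoping for (the dlt_pipeline_id key and the {{dlt_pipeline.id}} placeholder are just my guesses, not something I have confirmed is supported):

```python
# Hoped-for pattern (assumption, not verified):
# in the DLT pipeline settings JSON, under "configuration":
#   "dlt_pipeline_id": "{{dlt_pipeline.id}}"

# then, inside a notebook attached to that pipeline (where Databricks
# provides the `spark` session), read the parameter back:
pipeline_id = spark.conf.get("dlt_pipeline_id")
print(f"running inside pipeline: {pipeline_id}")
```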
Accepted Solutions
02-18-2025 06:18 AM
Hi @mourakshit,
I tried all three methods you mentioned, and none of them worked:
method_1 returned the literal placeholder text {{dlt_pipeline.name}} / {{dlt_pipeline.id}} as the printed value, not the actual pipeline name or ID.
method_2 returned an error saying no conf like spark.databricks.pipeline.name exists.
method_3 failed with the same problem as method_1.
It would be really helpful if you could share a demo example or screenshot using any of these methods.
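In case it helps narrow things down, here is a small diagnostic one could run inside the pipeline notebook to see which pipeline-related Spark confs the runtime actually exposes (just a sketch; I am not assuming any particular key exists):

```python
# List every currently set Spark conf whose key mentions "pipeline".
# `spark` is the session Databricks provides in the notebook.
for row in spark.sql("SET").collect():
    if "pipeline" in row.key.lower():
        print(row.key, "=", row.value)
```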
02-12-2025 10:06 PM - edited 02-12-2025 10:08 PM
Any ideas on this, @VZLA @Alberto_Umana?
02-13-2025 11:02 AM
You could try the following:
1. Use the {{dlt_pipeline.name}} syntax in your DLT pipeline configuration, just like you would in a notebook task. This will replace the placeholder with the actual name of the DLT pipeline, and you can then read it with spark.conf.get("dlt_pipeline_id").
2. Use the spark.conf.set method in the DLT pipeline to set the dlt_pipeline_id configuration parameter to the name of the current DLT pipeline.
3. Retrieve the dlt_pipeline_id value with dbutils.widgets.get("dlt_pipeline_id").
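Roughly, the three options would look like the sketch below (untested; dlt_pipeline_id and the {{dlt_pipeline.name}} placeholder are the names proposed above, not confirmed built-in substitutions):

```python
# Method 1: placeholder in the pipeline configuration, e.g.
#   "configuration": { "dlt_pipeline_id": "{{dlt_pipeline.name}}" }
# then read it back inside the pipeline notebook:
pipeline_id = spark.conf.get("dlt_pipeline_id", None)

# Method 2: set the parameter yourself from inside the pipeline code
# (the value here is a hypothetical hard-coded name):
spark.conf.set("dlt_pipeline_id", "my_pipeline_name")

# Method 3: read it as a widget, as you would for a notebook task parameter
# (may only resolve when the notebook runs as a task, not inside a DLT update):
# pipeline_id = dbutils.widgets.get("dlt_pipeline_id")
```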

