I have a DLT pipeline, and the notebook that runs inside it has a requirement:
I want to get the catalog and schema that are set in my DLT pipeline configuration.
Reason for it: I have to build paths to files in a volume, and that volume lives in the same catalog and schema that I have configured as the target for my DLT pipeline, so hard-coding them twice feels wrong. The sketch below shows what I mean.
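To make the goal concrete, this is roughly what my notebook does today (the catalog, schema, and volume names are placeholders; the first two are exactly the values I want to read from the pipeline configuration instead of hard-coding):

```python
# What I do today: hard-code the same catalog/schema that the DLT pipeline
# is already configured with (placeholder names).
catalog = "my_catalog"   # should come from the pipeline's configured catalog
schema = "my_schema"     # should come from the pipeline's configured target schema

# Unity Catalog volume paths follow /Volumes/<catalog>/<schema>/<volume>/...
volume_path = f"/Volumes/{catalog}/{schema}/my_volume/landing/"
```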
Is there any way, like spark.conf.get("catalog") or spark.conf.get("schema"), to read these DLT pipeline configuration values from inside the notebook?
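In other words, I'm hoping for something along these lines. To be clear, the conf keys below are made up and are exactly what I'm asking about; the only related key I believe works inside a pipeline is spark.conf.get("pipelines.id") for the pipeline id:

```python
# Hypothetical -- I have not found documented conf keys for catalog/schema.
catalog = spark.conf.get("pipelines.catalog")  # hypothetical key
schema = spark.conf.get("pipelines.schema")    # hypothetical key

volume_path = f"/Volumes/{catalog}/{schema}/my_volume/landing/"
```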
I could use the Databricks API, but for that I would also need the pipeline_id, and hard-coding the pipeline_id of the very pipeline that is running the notebook doesn't sound convincing.
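For reference, this is the fallback I'd rather avoid, sketched with the Databricks Python SDK. I believe the pipeline spec carries the catalog and target schema, but the point is that I'd still have to supply the pipeline_id of the pipeline this notebook already runs in:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# The pipeline_id would have to be hard-coded or passed in somehow,
# which feels redundant for the pipeline that is running this notebook.
pipeline = w.pipelines.get(pipeline_id="<my-pipeline-id>")

catalog = pipeline.spec.catalog  # catalog configured on the pipeline
schema = pipeline.spec.target    # target schema for UC-enabled pipelines
```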