Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Fetching the catalog and schema set in the DLT pipeline configuration

ashraf1395
Honored Contributor

I have a DLT pipeline, and the notebook that runs in it has a few requirements.

I want to get the catalog and schema that are set in my DLT pipeline configuration.
Reason: I have to specify my volume file paths, and the volume lives in the same catalog and schema that I set in my DLT pipeline configuration.

Is there any way, like spark.conf.get("catalog") or a schema equivalent, to read my DLT pipeline configuration from the notebook?
I could use the Databricks API, but for that I would also need the DLT pipeline_id, and passing the pipeline_id into the same DLT pipeline doesn't seem like a clean approach.

1 REPLY

SP_6721
New Contributor III

Hi @ashraf1395 

Could you try this in your notebook to get the catalog and schema set by your DLT pipeline?

# Read the pipeline's target catalog and schema from the Spark configuration
catalog = spark.conf.get("pipelines.catalog")
schema = spark.conf.get("pipelines.schema")
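
If that works, the values can be plugged straight into your volume paths. A minimal sketch, assuming a Unity Catalog volume named my_volume (hypothetical) exists in that same catalog and schema:

# Assumes the keys above returned values and that a volume named
# "my_volume" (hypothetical) exists in the pipeline's catalog and schema.
catalog = spark.conf.get("pipelines.catalog")
schema = spark.conf.get("pipelines.schema")

# Unity Catalog volumes are addressed as /Volumes/<catalog>/<schema>/<volume>/...
volume_path = f"/Volumes/{catalog}/{schema}/my_volume/landing/"

# Example: read raw files from the volume inside the DLT notebook
df = spark.read.format("json").load(volume_path)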
