How do I access to DLT advanced configuration from python notebook?
09-13-2023 12:19 PM
Hi Team,
I'm trying to read a DLT Advanced configuration value from my Python DLT notebook.
For example, I set "something": "some path" under Advanced configuration in the DLT pipeline settings, and I want to read that value from my DLT notebook. I tried dbutils.widgets.get("something") in my notebook and triggered the pipeline, but I got an error.
How can I get the value?
04-18-2024 03:02 PM
The following docs will help; please check the examples: https://docs.databricks.com/en/delta-live-tables/settings.html#parameterize-pipelines
06-18-2024 07:13 AM - edited 06-19-2024 01:24 AM
Hi @jose_gonzalez, I am getting the same error as the author of the post. In that example, what is "mypipeline" referring to?
I am getting the following error:
org.apache.spark.SparkNoSuchElementException: [SQL_CONF_NOT_FOUND] The SQL config "mypipeline.host" cannot be found. Please verify that the config exists.
EDIT: I have figured it out. To read an Advanced configuration value, use it as follows:
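A minimal sketch of the pattern, assuming an Advanced configuration entry with the key mypipeline.host (the key name and the source table are placeholders). Inside a DLT pipeline, Advanced configuration entries are exposed as Spark configs, so they are read with spark.conf.get rather than dbutils.widgets.get, and the key must exist in the pipeline settings or SQL_CONF_NOT_FOUND is raised:

```python
# DLT pipeline notebook sketch -- runs only inside a Databricks DLT pipeline,
# where the `dlt` module and the `spark` session are provided by the runtime.
import dlt

# Read the Advanced configuration value; the key must match the pipeline
# settings exactly (here "mypipeline.host" is a hypothetical example key).
host = spark.conf.get("mypipeline.host")

@dlt.table
def my_table():
    # Use the configuration value inside the table definition,
    # e.g. to filter a (hypothetical) source table.
    return spark.read.table("source_table").filter(f"host = '{host}'")
```

Note that spark.conf.get only resolves these keys while the pipeline is running; triggering the same notebook interactively outside the pipeline reproduces the SQL_CONF_NOT_FOUND error.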
06-19-2024 07:32 AM
Here you can find the documentation on how to use parameters in DLT (SQL and Python):

