<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic How to provide env variables to a pipeline task in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/145768#M52574</link>
    <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I created a job with one pipeline task through DAB. Now I want to provide a variable to it, but its value is dynamic, based on the target environment. As pipeline tasks do not support widgets, how can I provide this variable to the pipeline?&lt;/P&gt;</description>
    <pubDate>Thu, 29 Jan 2026 15:55:45 GMT</pubDate>
    <dc:creator>yit337</dc:creator>
    <dc:date>2026-01-29T15:55:45Z</dc:date>
    <item>
      <title>How to provide env variables to a pipeline task</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/145768#M52574</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I created a job with one pipeline task through DAB. Now I want to provide a variable to it, but its value is dynamic, based on the target environment. As pipeline tasks do not support widgets, how can I provide this variable to the pipeline?&lt;/P&gt;</description>
      <pubDate>Thu, 29 Jan 2026 15:55:45 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/145768#M52574</guid>
      <dc:creator>yit337</dc:creator>
      <dc:date>2026-01-29T15:55:45Z</dc:date>
    </item>
    <item>
      <title>Re: How to provide env variables to a pipeline task</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/145772#M52577</link>
      <description>&lt;P&gt;Under the resources folder you can add these two files:&lt;/P&gt;&lt;P&gt;&lt;U&gt;&lt;STRONG&gt;file1.yml&lt;/STRONG&gt;&lt;/U&gt;&lt;/P&gt;&lt;PRE&gt;variables:
  my_dynamic_variable:
    description: The name of the application.
    type: string
    default: app_name
  environment:
    description: The environment in which the application is running.
    type: string
    default: dev&lt;/PRE&gt;&lt;P&gt;&lt;U&gt;&lt;STRONG&gt;file2.yml&lt;/STRONG&gt;&lt;/U&gt; (an excerpt from the full bundle YAML file):&lt;/P&gt;&lt;PRE&gt;tasks:
  - task_key: my_pipeline_task
    pipeline_task:
      pipeline_id: ${resources.pipelines.my_pipeline.id}
    parameters:
      environment: ${var.environment}
      app_name: ${var.my_dynamic_variable}&lt;/PRE&gt;&lt;P&gt;databricks.yml can include the folder that holds those two files:&lt;/P&gt;&lt;PRE&gt;include:
  - ./resources/*.yml&lt;/PRE&gt;&lt;PRE&gt;# In your DLT pipeline notebook/file
environment = spark.conf.get("environment")
dynamic_var = spark.conf.get("app_name")&lt;/PRE&gt;</description>
      <pubDate>Thu, 29 Jan 2026 16:13:09 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/145772#M52577</guid>
      <dc:creator>saurabh18cs</dc:creator>
      <dc:date>2026-01-29T16:13:09Z</dc:date>
    </item>
    <item>
      <title>Re: How to provide env variables to a pipeline task</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146014#M52598</link>
      <description>&lt;P&gt;These are job parameters. Can I provide them just for the pipeline task?&lt;/P&gt;</description>
      <pubDate>Fri, 30 Jan 2026 10:42:44 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146014#M52598</guid>
      <dc:creator>yit337</dc:creator>
      <dc:date>2026-01-30T10:42:44Z</dc:date>
    </item>
    <item>
      <title>Re: How to provide env variables to a pipeline task</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146169#M52619</link>
      <description>&lt;P&gt;I believe you can use the configuration section under the pipeline settings in LDP, provide key-value pairs, and use the key as a parameter in your code.&lt;/P&gt;</description>
      <pubDate>Fri, 30 Jan 2026 18:01:04 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146169#M52619</guid>
      <dc:creator>IM_01</dc:creator>
      <dc:date>2026-01-30T18:01:04Z</dc:date>
    </item>
    <item>
      <title>Re: How to provide env variables to a pipeline task</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146393#M52638</link>
      <description>&lt;P&gt;You will have to pass it as configuration and then read it in your pipeline code using spark.conf.get.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="pradeep_singh_0-1769922979884.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/23541iBB920FCABCECD9DF/image-size/medium?v=v2&amp;amp;px=400" role="button" title="pradeep_singh_0-1769922979884.png" alt="pradeep_singh_0-1769922979884.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Then, in your pipeline notebook code:&lt;/P&gt;&lt;P&gt;demo_catalog = spark.conf.get("demo_catalog")&lt;/P&gt;</description>
      <pubDate>Sun, 01 Feb 2026 05:16:51 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146393#M52638</guid>
      <dc:creator>pradeep_singh</dc:creator>
      <dc:date>2026-02-01T05:16:51Z</dc:date>
    </item>
    <item>
      <title>Re: How to provide env variables to a pipeline task</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146401#M52642</link>
      <description>&lt;P&gt;Does it have to be static? Or could it be dynamic, provided from another task?&lt;/P&gt;</description>
      <pubDate>Sun, 01 Feb 2026 08:18:46 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146401#M52642</guid>
      <dc:creator>yit337</dc:creator>
      <dc:date>2026-02-01T08:18:46Z</dc:date>
    </item>
    <item>
      <title>Re: How to provide env variables to a pipeline task</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146404#M52643</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/210475"&gt;@yit337&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;As&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/193958"&gt;@IM_01&lt;/a&gt;&amp;nbsp;and&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/202980"&gt;@pradeep_singh&lt;/a&gt;&amp;nbsp;mentioned, you can use the&amp;nbsp;&lt;SPAN&gt;configuration section under the pipeline settings in LDP to provide key-value pairs. You can then refer to those parameters using&amp;nbsp;spark.conf.get("your_parameter_name") in PySpark, or using the ${} notation in SQL (as in the example screenshot below):&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="szymon_dybczak_0-1769939315649.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/23543i40D8A646FDC381EA/image-size/medium?v=v2&amp;amp;px=400" role="button" title="szymon_dybczak_0-1769939315649.png" alt="szymon_dybczak_0-1769939315649.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&lt;STRONG&gt;But those parameters are static&lt;/STRONG&gt;. If you really want to provide dynamic values, there is an ugly workaround: you can use the Databricks CLI to override parameters, as suggested in the following reply:&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="szymon_dybczak_0-1769938885514.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/23542iF2AC1DA126692C94/image-size/medium?v=v2&amp;amp;px=400" role="button" title="szymon_dybczak_0-1769938885514.png" alt="szymon_dybczak_0-1769938885514.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Here is a link to the entire discussion:&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;&lt;A href="https://community.databricks.com/t5/data-engineering/triggering-dlt-pipelines-with-dynamic-parameters/td-p/111581" target="_blank" rel="noopener"&gt;Triggering DLT Pipelines with Dynamic Parameters - Databricks Community - 111581&lt;/A&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Sun, 01 Feb 2026 09:48:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146404#M52643</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2026-02-01T09:48:54Z</dc:date>
    </item>
    <item>
      <title>Re: How to provide env variables to a pipeline task</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146416#M52644</link>
      <description>&lt;P&gt;If you’re looking to build a dynamic, configuration-driven DLT pipeline, a better approach is to use a configuration table. This table should include fields such as table_name, pipeline_name, table_properties, and other relevant settings. Your notebook can then query this table, filtering on the table and pipeline names that are passed in dynamically through variables. The resolved properties can then be accessed directly within your code.&lt;/P&gt;&lt;P&gt;You can keep things dynamic by updating the parameters in this table.&lt;/P&gt;</description>
      <pubDate>Sun, 01 Feb 2026 14:10:22 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/146416#M52644</guid>
      <dc:creator>pradeep_singh</dc:creator>
      <dc:date>2026-02-01T14:10:22Z</dc:date>
    </item>
    <item>
      <title>Re: How to provide env variables to a pipeline task</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/150181#M53290</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/210475"&gt;@yit337&lt;/a&gt;,&lt;/P&gt;
&lt;P&gt;Since pipeline tasks in Databricks Jobs do not support widgets the way notebook tasks do, the recommended approach is to use pipeline configuration parameters. These are key-value pairs you set in the pipeline definition, and you can make them dynamic per environment using Databricks Asset Bundle (DAB) variables and target overrides.&lt;/P&gt;
&lt;P&gt;Here is how it works end to end:&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;STEP 1: DEFINE VARIABLES IN YOUR BUNDLE&lt;/P&gt;
&lt;P&gt;In your databricks.yml, declare a variable with a default value and override it per target:&lt;/P&gt;
&lt;PRE&gt;variables:
  my_env_setting:
    description: "Environment-specific setting for my pipeline"
    default: "dev_value"&lt;/PRE&gt;
&lt;PRE&gt;targets:
  dev:
    default: true
    variables:
      my_env_setting: "dev_value"
  staging:
    variables:
      my_env_setting: "staging_value"
  prod:
    variables:
      my_env_setting: "prod_value"&lt;/PRE&gt;
&lt;P&gt;&lt;BR /&gt;STEP 2: PASS THE VARIABLE INTO YOUR PIPELINE CONFIGURATION&lt;/P&gt;
&lt;P&gt;In your pipeline resource definition, reference the bundle variable using the ${var.&amp;lt;name&amp;gt;} syntax inside the configuration block:&lt;/P&gt;
&lt;PRE&gt;resources:
  pipelines:
    my_pipeline:
      name: "my-pipeline"
      configuration:
        "my_env_setting": "${var.my_env_setting}"
      libraries:
        - notebook:
            path: ./my_notebook.py&lt;/PRE&gt;
&lt;P&gt;&lt;BR /&gt;STEP 3: READ THE PARAMETER IN YOUR PIPELINE CODE&lt;/P&gt;
&lt;P&gt;In Python, use spark.conf.get() to retrieve the value:&lt;/P&gt;
&lt;PRE&gt;from pyspark import pipelines as dp

@dp.table
def my_table():
    env_setting = spark.conf.get("my_env_setting")
    # Use env_setting in your logic
    return spark.read.table(f"{env_setting}.my_schema.my_source_table")&lt;/PRE&gt;
&lt;P&gt;In SQL, use the ${} template syntax:&lt;/P&gt;
&lt;PRE&gt;CREATE OR REFRESH MATERIALIZED VIEW my_view AS
SELECT *
FROM ${my_env_setting}.my_schema.my_source_table&lt;/PRE&gt;
&lt;P&gt;&lt;BR /&gt;IMPORTANT NOTES&lt;/P&gt;
&lt;P&gt;1. Parameter keys can contain only alphanumeric characters, underscores (_), hyphens (-), and dots (.).&lt;BR /&gt;2. Parameter values are always set as strings.&lt;BR /&gt;3. You can also pass variables at deployment time from the CLI: databricks bundle deploy --var="my_env_setting=custom_value"&lt;BR /&gt;4. If you prefer, you can also set overrides in a file at .databricks/bundle/&amp;lt;target&amp;gt;/variable-overrides.json.&lt;/P&gt;
&lt;P&gt;DOCUMENTATION REFERENCES&lt;/P&gt;
&lt;P&gt;- Use parameters with pipelines: &lt;A href="https://docs.databricks.com/aws/en/delta-live-tables/parameters.html" target="_blank"&gt;https://docs.databricks.com/aws/en/delta-live-tables/parameters.html&lt;/A&gt;&lt;BR /&gt;- Databricks Asset Bundle variables: &lt;A href="https://docs.databricks.com/aws/en/dev-tools/bundles/variables.html" target="_blank"&gt;https://docs.databricks.com/aws/en/dev-tools/bundles/variables.html&lt;/A&gt;&lt;BR /&gt;- Configure a pipeline: &lt;A href="https://docs.databricks.com/aws/en/delta-live-tables/configure-pipeline.html" target="_blank"&gt;https://docs.databricks.com/aws/en/delta-live-tables/configure-pipeline.html&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;This combination of DAB variables (for environment-specific values) and pipeline configuration parameters (for runtime access in code) gives you a clean way to handle environment-driven configuration without widgets.&lt;/P&gt;
&lt;P&gt;* This reply used an agent system I built to research and draft this response based on the wide set of documentation I have available and previous memory. I personally review the draft for any obvious issues and for monitoring system reliability and update it when I detect any drift, but there is still a small chance that something is inaccurate, especially if you are experimenting with brand new features.&lt;/P&gt;</description>
      <pubDate>Sun, 08 Mar 2026 07:30:27 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-provide-env-variables-to-a-pipeline-task/m-p/150181#M53290</guid>
      <dc:creator>SteveOstrowski</dc:creator>
      <dc:date>2026-03-08T07:30:27Z</dc:date>
    </item>
  </channel>
</rss>

