DLT & Publishing to Feature Store
https://community.databricks.com/t5/data-engineering/dlt-amp-publishing-to-feature-store/m-p/36938#M26225
Posted by kurt in Data Engineering on Tue, 04 Jul 2023 13:23:22 GMT
Hi,

Is there an example of incorporating the Databricks Feature Store into DLT pipelines? Is this possible natively, via a Python notebook that is part of the pipeline? (The docs say the Feature Store client needs an ML Runtime.) If it can't be done entirely within DLT, what is the current best way to do it - for example, is an outer workflow needed, along the lines of:

Workflow: DLTStep (DLT infra) -> Workflow: MLFeatureStoreStep (ML infra) -> Workflow: MLOnlineInference (DLT infra)

That is obviously not ideal, since it also incurs the extra cost of spinning up ML infrastructure.

Our use case is to push features that are already 'silver-ised' in DLT pipelines into the Feature Store via the Python API, to be used for both training (batch) and online serving, so we want to keep the featurisation code within DLT. To make that concrete, the two sketches below show roughly what I have in mind.
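First, the DLT side, i.e. keeping the featurisation inside the pipeline. This is a rough, untested sketch; the table and column names (bronze_events, silver_user_features, user_id, etc.) are all invented for illustration:

```python
# Hypothetical DLT pipeline notebook: featurisation stays inside DLT.
# 'dlt' is only importable when the notebook runs as part of a pipeline.
import dlt
from pyspark.sql import functions as F

@dlt.table(
    name="silver_user_features",
    comment="Silver-ised user features, one row per user_id.",
)
def silver_user_features():
    # Aggregate raw events into model-ready features; 'user_id' would be
    # the primary key if/when this table is published to the Feature Store.
    return (
        dlt.read("bronze_events")
        .groupBy("user_id")
        .agg(
            F.count("*").alias("event_count_30d"),
            F.avg("purchase_amount").alias("avg_purchase_amount"),
        )
    )
```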
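Second, roughly what I imagine the separate MLFeatureStoreStep task would look like, based on the Feature Store Python API docs - a notebook task on an ML Runtime cluster that runs after the DLT step. Again untested, with invented names; this is exactly the step I'd like to avoid if DLT can publish natively:

```python
# Runs on an ML Runtime cluster, where the feature-store client is preinstalled,
# scheduled as a workflow task downstream of the DLT pipeline.
from databricks.feature_store import FeatureStoreClient

fs = FeatureStoreClient()

# 'spark' is the ambient SparkSession in a Databricks notebook.
# Read the silver table the DLT pipeline wrote to its target schema.
feature_df = spark.read.table("my_db.silver_user_features")

FEATURE_TABLE = "feature_store_db.user_features"

# First run only: register the feature table with its primary key.
# (create_table raises if the table already exists, hence the guard.)
try:
    fs.get_table(FEATURE_TABLE)
except Exception:
    fs.create_table(
        name=FEATURE_TABLE,
        primary_keys=["user_id"],
        schema=feature_df.schema,
        description="User features computed in the DLT pipeline.",
    )

# Upsert the latest features; 'merge' matches rows on the primary key.
fs.write_table(name=FEATURE_TABLE, df=feature_df, mode="merge")
```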
Are there any referenceable examples to help - blogs, GitHub repos, notebooks, etc.?

Thanks!

