Hi,
Is there an example of incorporating Databricks Feature Store into DLT pipelines? Is this possible natively via a Python notebook that is part of the pipeline? (FYI - the docs say the Feature Store client needs the ML Runtime.) If it isn't fully achievable within DLT, what is the best current way to do this - e.g. is an outer workflow needed, something like:
Workflow: DLTStep (DLT infra) -> Workflow: MLFeatureStoreStep (ML infra) -> Workflow: MLOnlineInference (DLT infra)
Obviously not ideal - there's the extra cost of spinning up ML infra as well.
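For concreteness, here's a rough sketch of what I imagine that outer workflow would look like as a multi-task job definition (Jobs API style). All IDs, paths, and cluster settings are placeholders, and I may well be off on the exact fields - this is just to illustrate the orchestration I'd like to avoid:

```json
{
  "name": "feature-pipeline-orchestration",
  "tasks": [
    {
      "task_key": "dlt_silver_features",
      "pipeline_task": { "pipeline_id": "<dlt-pipeline-id>" }
    },
    {
      "task_key": "ml_feature_store_publish",
      "depends_on": [ { "task_key": "dlt_silver_features" } ],
      "notebook_task": { "notebook_path": "<path-to-publish-features-notebook>" },
      "new_cluster": {
        "spark_version": "13.3.x-cpu-ml-scala2.12",
        "node_type_id": "<node-type>",
        "num_workers": 1
      }
    },
    {
      "task_key": "ml_online_inference",
      "depends_on": [ { "task_key": "ml_feature_store_publish" } ],
      "pipeline_task": { "pipeline_id": "<inference-pipeline-id>" }
    }
  ]
}
```

The middle task is the only one that needs an ML Runtime cluster (for the Feature Store client), which is exactly the extra infra spin-up I'd like to avoid if the write can happen inside the DLT pipeline itself.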
Our use case is to push already 'silver-ised' features (built in DLT pipelines) into the Feature Store via the Python API, to be used for both batch training and online serving, so we want to keep the featurisation code within DLT.
Either way, are there any referenceable examples - blogs, GitHub repos, notebooks, etc. - to help?
Thanks!