Hi @lndlzy, to integrate MLflow Recipes with the Databricks Feature Store, follow these steps:
1. **Define Features**: Write code that converts raw data into features and produces a Spark DataFrame containing them. If your workspace is enabled for it, write that DataFrame as a feature table in the Workspace Feature Store.
2. **Train Model**: Train a model using features from the feature store. When you do this, the model stores a specification of the features used for training. When the model is later used for inference, it automatically joins features from the appropriate feature tables.
3. **Register Model**: Register the trained model in the Model Registry.
4. **Model Inference**: You can now use the model to score new data; it automatically retrieves the features it needs from the Feature Store. For real-time serving use cases, publish the features to an online store. At inference time, the model reads pre-computed features from the online store and joins them with the data supplied in the client request to the model serving endpoint.
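To make step 1 concrete, here is a minimal sketch of turning raw data into one feature row per primary key. The table and column names (`customer_id`, `amount`, and so on) are illustrative, not from the post; on Databricks you would build the equivalent Spark DataFrame and persist it with `FeatureStoreClient().create_table(...)`, passing the primary key columns.

```python
# Hedged sketch: compute per-customer features from raw transaction rows.
# All names here are hypothetical examples.
import pandas as pd

raw = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 2],
    "amount": [10.0, 30.0, 5.0, 5.0, 20.0],
})

# Aggregate the raw rows into exactly one feature row per primary key,
# which is the shape a feature table expects.
features = (
    raw.groupby("customer_id")
       .agg(total_spend=("amount", "sum"),
            txn_count=("amount", "count"))
       .reset_index()
)
print(features)
```

The resulting DataFrame, keyed by `customer_id`, is what you would register as the feature table in step 1.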
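The "automatic feature lookup" in steps 2 and 4 is, at its core, a join on the primary key: the caller supplies only keys (plus the label, at training time), and the stored feature specification drives a lookup against the feature table. In Databricks this join is performed for you by `create_training_set` and `score_batch`; the plain pandas merge below just illustrates the semantics with hypothetical names.

```python
# Hedged sketch of key-based feature lookup; names are illustrative.
import pandas as pd

feature_table = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "total_spend": [40.0, 30.0, 75.0],
    "txn_count": [2, 3, 1],
})

# Training time: the input carries keys and the label; features are
# joined in by key to build the training DataFrame.
labels = pd.DataFrame({"customer_id": [1, 3], "churned": [0, 1]})
training_df = labels.merge(feature_table, on="customer_id", how="left")

# Inference time: the request carries only the key; the same lookup runs
# before the model predicts, so clients never ship feature values.
request = pd.DataFrame({"customer_id": [2]})
scoring_df = request.merge(feature_table, on="customer_id", how="left")
print(scoring_df)
```

This is why a registered feature-store model needs only key columns at scoring time: the feature values are resolved from the (offline or online) store, not from the request payload.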
For a step-by-step guide, see the Basic Feature Store example notebook in the Databricks documentation. It walks through creating a feature table, using it to train a model, and then performing batch scoring with automatic feature lookup.