<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: MLFlow Serve Logging in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15493#M9812</link>
    <description>&lt;P&gt;Another word from a Databricks employee:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;"""&lt;/P&gt;&lt;P&gt;You can use the custom model approach, but configuring it is painful. Plus, you have to embed every loggable model in the custom model.&amp;nbsp;Another, less intrusive solution would be to have a proxy server do the logging and then defer to the MLflow model server. See this very basic POC:&amp;nbsp;&lt;A href="https://github.com/amesar/mlflow-model-monitoring" alt="https://github.com/amesar/mlflow-model-monitoring" target="_blank"&gt;https://github.com/amesar/mlflow-model-monitoring&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Also check out Seldon Alibi for advanced monitoring.&lt;/P&gt;&lt;P&gt;"""&lt;/P&gt;</description>
    <pubDate>Wed, 15 Sep 2021 01:14:55 GMT</pubDate>
    <dc:creator>Dan_Z</dc:creator>
    <dc:date>2021-09-15T01:14:55Z</dc:date>
    <item>
      <title>MLFlow Serve Logging</title>
      <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15485#M9804</link>
      <description>&lt;P&gt;When using Azure Databricks and serving a model, we have received requests to capture additional logging. In some instances, they would like to capture input and output, or even some of the steps from a pipeline.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Is there any way we can extend the logging with an MLflow REST endpoint to capture the additional required information?&lt;/P&gt;</description>
      <pubDate>Tue, 14 Sep 2021 13:59:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15485#M9804</guid>
      <dc:creator>BeardyMan</dc:creator>
      <dc:date>2021-09-14T13:59:34Z</dc:date>
    </item>
    <item>
      <title>Re: MLFlow Serve Logging</title>
      <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15487#M9806</link>
      <description>&lt;P&gt;To my knowledge, if you write a custom model's predict() function, you can perform arbitrary operations in it (e.g., log inputs or outputs somewhere).&lt;/P&gt;</description>
      <pubDate>Tue, 14 Sep 2021 18:19:08 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15487#M9806</guid>
      <dc:creator>Dan_Z</dc:creator>
      <dc:date>2021-09-14T18:19:08Z</dc:date>
    </item>
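The custom-predict idea suggested above can be sketched in plain Python. In MLflow you would typically subclass mlflow.pyfunc.PythonModel for this; to keep the sketch self-contained and runnable, the LoggingModelWrapper and EchoModel below are hypothetical stand-ins for any model object exposing a predict() method.

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("model-io")

class LoggingModelWrapper:
    """Wraps any model with a predict() method and logs each input/output pair."""

    def __init__(self, model):
        self.model = model

    def predict(self, X):
        preds = self.model.predict(X)
        # Log the request/response pair; in production this could instead be
        # an RPC call or a write to cloud storage for later analysis.
        logger.info("input=%s output=%s", json.dumps(X), json.dumps(preds))
        return preds

class EchoModel:
    """Stand-in model: doubles every input value."""

    def predict(self, X):
        return [x * 2 for x in X]

wrapped = LoggingModelWrapper(EchoModel())
print(wrapped.predict([1, 2, 3]))  # [2, 4, 6]
```

The wrapper keeps the logging concern out of the model itself, which is the same separation the later proxy-server suggestion takes one step further.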
    <item>
      <title>Re: MLFlow Serve Logging</title>
      <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15488#M9807</link>
      <description>&lt;P&gt;Do you mean to use Azure Functions and custom Python code to call the model and then perform the required logging, rather than using the MLflow serve capability and the managed REST endpoint?&lt;/P&gt;</description>
      <pubDate>Tue, 14 Sep 2021 19:32:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15488#M9807</guid>
      <dc:creator>BeardyMan</dc:creator>
      <dc:date>2021-09-14T19:32:02Z</dc:date>
    </item>
    <item>
      <title>Re: MLFlow Serve Logging</title>
      <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15489#M9808</link>
      <description>&lt;P&gt;My thought was:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;Create a custom model with a predict function that does extra work (like logging)&lt;/LI&gt;&lt;LI&gt;Register the Model&lt;/LI&gt;&lt;LI&gt;Run the model in Model Serving&lt;/LI&gt;&lt;/OL&gt;</description>
      <pubDate>Tue, 14 Sep 2021 21:05:19 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15489#M9808</guid>
      <dc:creator>Dan_Z</dc:creator>
      <dc:date>2021-09-14T21:05:19Z</dc:date>
    </item>
    <item>
      <title>Re: MLFlow Serve Logging</title>
      <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15490#M9809</link>
      <description>&lt;P&gt;Here is an example of a custom model based on the sklearn model "GradientBoostingClassifier":&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;import sklearn.ensemble

class CustomizedGradientBoostingClassifier(sklearn.ensemble.GradientBoostingClassifier):
  def __init__(self, random_state):
    super().__init__(random_state=random_state)

  def fit(self, X, y):
    # Return self, following the scikit-learn fit() convention
    return super().fit(X, y)

  def predict_proba(self, X_test):
    return super().predict_proba(X_test)

  def predict(self, X):
    # Do customized tasks here (e.g. issuing an RPC call to log the input and output)

    # For example, you can return not only the predicted result but also the input
    return (super().predict(X), X)&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;You can register the model as usual. When you invoke the REST endpoint, the custom predict() function runs your extra logic and returns both the predicted result and the input.&lt;/P&gt;</description>
      <pubDate>Tue, 14 Sep 2021 21:37:45 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15490#M9809</guid>
      <dc:creator>ChenranLi</dc:creator>
      <dc:date>2021-09-14T21:37:45Z</dc:date>
    </item>
    <item>
      <title>Re: MLFlow Serve Logging</title>
      <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15491#M9810</link>
      <description>&lt;P&gt;Thank you for the clarification, I understand what you mean now and that's exactly what I was hoping for! &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 14 Sep 2021 22:46:00 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15491#M9810</guid>
      <dc:creator>BeardyMan</dc:creator>
      <dc:date>2021-09-14T22:46:00Z</dc:date>
    </item>
    <item>
      <title>Re: MLFlow Serve Logging</title>
      <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15492#M9811</link>
      <description>&lt;P&gt;Thank you @Chenran Li, the example is exceedingly helpful. I will be sure to try this out!&lt;/P&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 14 Sep 2021 22:46:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15492#M9811</guid>
      <dc:creator>BeardyMan</dc:creator>
      <dc:date>2021-09-14T22:46:40Z</dc:date>
    </item>
    <item>
      <title>Re: MLFlow Serve Logging</title>
      <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15493#M9812</link>
      <description>&lt;P&gt;Another word from a Databricks employee:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;"""&lt;/P&gt;&lt;P&gt;You can use the custom model approach, but configuring it is painful. Plus, you have to embed every loggable model in the custom model.&amp;nbsp;Another, less intrusive solution would be to have a proxy server do the logging and then defer to the MLflow model server. See this very basic POC:&amp;nbsp;&lt;A href="https://github.com/amesar/mlflow-model-monitoring" alt="https://github.com/amesar/mlflow-model-monitoring" target="_blank"&gt;https://github.com/amesar/mlflow-model-monitoring&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Also check out Seldon Alibi for advanced monitoring.&lt;/P&gt;&lt;P&gt;"""&lt;/P&gt;</description>
      <pubDate>Wed, 15 Sep 2021 01:14:55 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15493#M9812</guid>
      <dc:creator>Dan_Z</dc:creator>
      <dc:date>2021-09-15T01:14:55Z</dc:date>
    </item>
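The proxy-server approach quoted above can be sketched in a few lines: log each scoring request, forward it to the real serving endpoint, then log the response. The endpoint URL, the score() helper, and the in-memory REQUEST_LOG are all hypothetical; the forwarding transport is injectable so it can be swapped for any HTTP client (or stubbed out).

```python
import json
import urllib.request

MODEL_SERVER_URL = "http://127.0.0.1:5000/invocations"  # assumed serving endpoint
REQUEST_LOG = []  # in production, write to durable storage instead

def score(payload, post=None):
    """Log the payload, forward it to the model server, and log the response.

    `post` is injectable so the transport can be replaced or stubbed;
    by default it POSTs the payload as JSON via urllib.
    """
    if post is None:
        def post(body):
            req = urllib.request.Request(
                MODEL_SERVER_URL,
                data=json.dumps(body).encode(),
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                return json.loads(resp.read())
    REQUEST_LOG.append({"input": payload})
    result = post(payload)
    REQUEST_LOG[-1]["output"] = result
    return result
```

Because the model server is untouched, this works with any registered model, which is the "less intrusive" property the quote highlights over the custom-model approach.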
    <item>
      <title>Re: MLFlow Serve Logging</title>
      <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15494#M9813</link>
      <description>&lt;P&gt;Thank you, Dan. We had originally suggested the route of using Azure API Management, or using an Azure Function as an API wrapper, to do the logging we want and then forward the call on to the MLflow model serving REST endpoint. I was just wondering if there was a better alternative or something obvious we were missing.&lt;/P&gt;</description>
      <pubDate>Wed, 15 Sep 2021 19:54:50 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/15494#M9813</guid>
      <dc:creator>BeardyMan</dc:creator>
      <dc:date>2021-09-15T19:54:50Z</dc:date>
    </item>
    <item>
      <title>Re: MLFlow Serve Logging</title>
      <link>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/83256#M36892</link>
      <description>&lt;P&gt;Hey Dan, we do that, but in my case I don't see the logs in the event logs tab. Where could they be?&lt;/P&gt;</description>
      <pubDate>Fri, 16 Aug 2024 17:36:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/mlflow-serve-logging/m-p/83256#M36892</guid>
      <dc:creator>zainabs</dc:creator>
      <dc:date>2024-08-16T17:36:54Z</dc:date>
    </item>
  </channel>
</rss>

