<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Feature Store Model Serving endpoint in Machine Learning</title>
    <link>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75728#M3387</link>
    <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/101172"&gt;@NaeemS&lt;/a&gt;, it's hard to say given how uninformative the error is. I will try to give it a go next week, but maybe you can help me by answering a few questions:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Can you paste the exact line of code that triggers the error?&lt;/LI&gt;&lt;LI&gt;Does the sklearn model also use the Feature Store? Is the only difference in the library used (pyspark vs sklearn)?&lt;/LI&gt;&lt;LI&gt;Is your Online Feature Store correctly configured with the right credentials to retrieve the latest feature set?&lt;/LI&gt;&lt;LI&gt;What happens if you remove &lt;PRE&gt;databricks-feature-lookup==0.*&lt;/PRE&gt; from your requirements file?&lt;/LI&gt;&lt;/UL&gt;</description>
    <pubDate>Tue, 25 Jun 2024 14:56:42 GMT</pubDate>
    <dc:creator>robbe</dc:creator>
    <dc:date>2024-06-25T14:56:42Z</dc:date>
    <item>
      <title>Feature Store Model Serving endpoint</title>
      <link>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75249#M3377</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I am trying to deploy my model, which was logged with the FeatureEngineeringClient, as a serving endpoint in Databricks. But I am facing the following error:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;The Databricks Lookup client from databricks-feature-lookup and Databricks Feature Store client from databricks-feature-engineering cannot be installed in the same python environment.&lt;/LI-CODE&gt;&lt;P&gt;When I log my model using the FeatureEngineeringClient, databricks-feature-lookup is added to my requirements file by default, and when I load that model in another environment I get this error. Also, when I add databricks-feature-engineering as a dependency I get the same error during model serving endpoint creation. Conversely, if I don't add databricks-feature-engineering as a dependency, I get other errors while creating my serving endpoint.&lt;/P&gt;&lt;LI-CODE lang="python"&gt;model_impl = importlib.import_module(conf[MAIN])._load_pyfunc(data_path)
[cc8d72wmj6] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/mlflow/spark/__init__.py", line 898, in _load_pyfunc
[cc8d72wmj6] spark = _create_local_spark_session_for_loading_spark_model()
[cc8d72wmj6] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/mlflow/utils/_spark_utils.py", line 131, in _create_local_spark_session_for_loading_spark_model
[cc8d72wmj6] .getOrCreate()
[cc8d72wmj6] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/pyspark/sql/session.py", line 497, in getOrCreate
[cc8d72wmj6] sc = SparkContext.getOrCreate(sparkConf)
[cc8d72wmj6] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/pyspark/context.py", line 515, in getOrCreate
[cc8d72wmj6] SparkContext(conf=conf or SparkConf())
[cc8d72wmj6] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/pyspark/context.py", line 201, in __init__
[cc8d72wmj6] SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
[cc8d72wmj6] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/pyspark/context.py", line 436, in _ensure_initialized
[cc8d72wmj6] SparkContext._gateway = gateway or launch_gateway(conf)
[cc8d72wmj6] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/pyspark/java_gateway.py", line 107, in launch_gateway
[cc8d72wmj6] raise PySparkRuntimeError(
[cc8d72wmj6] pyspark.errors.exceptions.base.PySparkRuntimeError: [JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number.
[cc8d72wmj6] [2024-06-20 21:04:06 +0000] [10] [ERROR] Error handling request /v2/health/ready&lt;/LI-CODE&gt;&lt;P&gt;I would appreciate any help in solving this issue.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;</description>
      <pubDate>Thu, 20 Jun 2024 22:03:30 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75249#M3377</guid>
      <dc:creator>NaeemS</dc:creator>
      <dc:date>2024-06-20T22:03:30Z</dc:date>
    </item>
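The error above boils down to two mutually exclusive packages ending up in one Python environment: databricks-feature-lookup (intended for serving) and databricks-feature-engineering (intended for training). As an illustrative sketch only, not part of any Databricks library, a small helper can scan a requirements list for the conflicting pair before an endpoint is created:

```python
# Illustrative sketch (hypothetical helper, not part of any Databricks
# library): scan a requirements list for the two Feature Store clients
# that cannot coexist in one Python environment, per the error above.

CONFLICTING = {"databricks-feature-lookup", "databricks-feature-engineering"}

def package_name(requirement: str) -> str:
    """Return the bare package name, cutting at the first version-specifier character."""
    name = ""
    for ch in requirement:
        if ch.isalnum() or ch in "-_.":
            name += ch
        else:
            break
    return name

def find_conflicts(requirements: list[str]) -> set[str]:
    """Return whichever of the mutually exclusive clients appear in the requirements."""
    return CONFLICTING & {package_name(r) for r in requirements}

serving_reqs = ["mlflow==2.14.1", "pyspark==3.5.1", "databricks-feature-lookup==0.*"]
# Only the lookup client is present, which is what a serving environment expects.
print(find_conflicts(serving_reqs))
```

Running the check against the requirements file a model was logged with would flag the case where both clients are pulled in at once.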
    <item>
      <title>Re: Feature Store Model Serving endpoint</title>
      <link>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75361#M3381</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/9"&gt;@Retired_mod&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;Thanks for your response. But I have limited access to shared clusters and cannot use one for this purpose. Can you tell me how I can prevent &lt;EM&gt;&lt;STRONG&gt;databricks-feature-lookup&lt;/STRONG&gt;&lt;/EM&gt; from being added to my requirements file when I am logging my model using the FeatureEngineeringClient, which is causing this issue?&lt;/P&gt;&lt;P&gt;Or is there any other workaround for solving this issue with a single user cluster?&lt;/P&gt;</description>
      <pubDate>Fri, 21 Jun 2024 16:54:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75361#M3381</guid>
      <dc:creator>NaeemS</dc:creator>
      <dc:date>2024-06-21T16:54:02Z</dc:date>
    </item>
    <item>
      <title>Re: Feature Store Model Serving endpoint</title>
      <link>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75519#M3384</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/101172"&gt;@NaeemS&lt;/a&gt;, but the issue should not arise from having &lt;STRONG&gt;databricks-feature-lookup&lt;/STRONG&gt; in your serving endpoint, quite the opposite. You should have the &lt;STRONG&gt;databricks-feature-lookup&lt;/STRONG&gt; dependency but not the &lt;STRONG&gt;databricks-feature-engineering&lt;/STRONG&gt; dependency.&lt;/P&gt;&lt;P&gt;Can you please elaborate on the error that you mentioned in the OP, perhaps including some reproducible code?&lt;/P&gt;</description>
      <pubDate>Sun, 23 Jun 2024 19:22:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75519#M3384</guid>
      <dc:creator>robbe</dc:creator>
      <dc:date>2024-06-23T19:22:54Z</dc:date>
    </item>
    <item>
      <title>Re: Feature Store Model Serving endpoint</title>
      <link>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75637#M3385</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/102750"&gt;@robbe&lt;/a&gt;,&lt;BR /&gt;I am facing the following errors when creating a serving endpoint for my Spark pipeline:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[5548dsptvc] raise self._exception
[5548dsptvc] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/mlflowserving/scoring_server/__init__.py", line 194, in get_model_option_or_exit
[5548dsptvc] self.model = self.model_future.result()
[5548dsptvc] File "/opt/conda/envs/mlflow-env/lib/python3.10/concurrent/futures/_base.py", line 451, in result
[5548dsptvc] return self.__get_result()
[5548dsptvc] File "/opt/conda/envs/mlflow-env/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
[5548dsptvc] raise self._exception
[5548dsptvc] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/mlflowserving/scoring_server/__init__.py", line 194, in get_model_option_or_exit
[5548dsptvc] self.model = self.model_future.result()
[5548dsptvc] File "/opt/conda/envs/mlflow-env/lib/python3.10/concurrent/futures/_base.py", line 451, in result
[5548dsptvc] return self.__get_result()
[5548dsptvc] File "/opt/conda/envs/mlflow-env/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
[5548dsptvc] raise self._exception
[5548dsptvc] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/mlflowserving/scoring_server/__init__.py", line 194, in get_model_option_or_exit
[5548dsptvc] self.model = self.model_future.result()
[5548dsptvc] File "/opt/conda/envs/mlflow-env/lib/python3.10/concurrent/futures/_base.py", line 451, in result
[5548dsptvc] return self.__get_result()&lt;/LI-CODE&gt;&lt;P&gt;I have tested creating a sklearn model in my environment, which works fine, but for the Spark pipeline I am unable to do so. The following is the requirements file that gets logged with my model:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;mlflow==2.14.1
numpy==1.21.5
pyspark==3.5.1
scipy==1.9.1
databricks-feature-lookup==0.*&lt;/LI-CODE&gt;&lt;P&gt;&lt;BR /&gt;databricks-feature-lookup 1.2.14 is being installed in my environment, as the earlier versions have been deprecated.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;</description>
      <pubDate>Mon, 24 Jun 2024 22:14:42 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75637#M3385</guid>
      <dc:creator>NaeemS</dc:creator>
      <dc:date>2024-06-24T22:14:42Z</dc:date>
    </item>
    <item>
      <title>Re: Feature Store Model Serving endpoint</title>
      <link>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75728#M3387</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/101172"&gt;@NaeemS&lt;/a&gt;, it's hard to say given how uninformative the error is. I will try to give it a go next week, but maybe you can help me by answering a few questions:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Can you paste the exact line of code that triggers the error?&lt;/LI&gt;&lt;LI&gt;Does the sklearn model also use the Feature Store? Is the only difference in the library used (pyspark vs sklearn)?&lt;/LI&gt;&lt;LI&gt;Is your Online Feature Store correctly configured with the right credentials to retrieve the latest feature set?&lt;/LI&gt;&lt;LI&gt;What happens if you remove &lt;PRE&gt;databricks-feature-lookup==0.*&lt;/PRE&gt; from your requirements file?&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Tue, 25 Jun 2024 14:56:42 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75728#M3387</guid>
      <dc:creator>robbe</dc:creator>
      <dc:date>2024-06-25T14:56:42Z</dc:date>
    </item>
    <item>
      <title>Re: Feature Store Model Serving endpoint</title>
      <link>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75874#M3393</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/102750"&gt;@robbe&lt;/a&gt;,&lt;/P&gt;&lt;P&gt;My model was logged successfully using the feature engineering client. The issue appears while creating a serving endpoint for that model.&lt;BR /&gt;For the sklearn model I used a simple Random Forest model, and for the Spark model I am logging a pipeline with multiple steps such as imputer, indexer, assembler, and model.&lt;BR /&gt;Also, I am not using third-party feature stores here. I am using online tables within Unity Catalog, which is a relatively new feature introduced by Databricks. I am using the same store with the sklearn model and it works fine; the issue appears only with the Spark pipeline.&lt;/P&gt;&lt;P&gt;While logging my model I am not specifying &lt;EM&gt;databricks-feature-lookup&lt;/EM&gt; as a dependency; it is added by default with my model even if I provide my own requirements file while logging the model.&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Wed, 26 Jun 2024 17:15:51 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/75874#M3393</guid>
      <dc:creator>NaeemS</dc:creator>
      <dc:date>2024-06-26T17:15:51Z</dc:date>
    </item>
    <item>
      <title>Re: Feature Store Model Serving endpoint</title>
      <link>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/79823#M3449</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/101172"&gt;@NaeemS&lt;/a&gt;, did you manage to find a fix? I tried to run the same setup that you have and I am running into the same problem, and I think I found the reason: JAVA_HOME is not set.&lt;/P&gt;&lt;P&gt;So the Feature Engineering library seems to be interfering with the Java installation. I'll try to look into the issue further.&lt;/P&gt;</description>
      <pubDate>Mon, 22 Jul 2024 08:28:20 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/79823#M3449</guid>
      <dc:creator>robbe</dc:creator>
      <dc:date>2024-07-22T08:28:20Z</dc:date>
    </item>
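Since the [JAVA_GATEWAY_EXITED] traceback in the original post ends in pyspark's launch_gateway, the JAVA_HOME hypothesis above can be made explicit before any SparkSession is built. A minimal sketch, assuming only that the container exposes its configuration via environment variables; this helper is hypothetical and not part of pyspark or mlflow:

```python
# Sketch under the assumption that the serving container is missing
# JAVA_HOME (hypothetical helper, not part of pyspark or mlflow).
# Checking the Java setup up front turns the opaque gateway failure
# into a clear message before any SparkSession is created.
import os

def java_home_usable(env) -> bool:
    """True when the given mapping points JAVA_HOME at an existing directory."""
    java_home = env.get("JAVA_HOME", "")
    return bool(java_home) and os.path.isdir(java_home)

if not java_home_usable(os.environ):
    print("JAVA_HOME is missing or invalid; pyspark's Java gateway would exit "
          "before sending its port number.")
```

Logging this check from the model's load path would make the failure mode visible in the endpoint build logs instead of surfacing only as a repeated traceback.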
    <item>
      <title>Re: Feature Store Model Serving endpoint</title>
      <link>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/80259#M3508</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/102750"&gt;@robbe&lt;/a&gt;, I'm facing the same error as&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/101172"&gt;@NaeemS&lt;/a&gt;. I've deployed an endpoint for a RAG chain in Azure Databricks and at first, it worked well. I've &lt;FONT color="#808080"&gt;&lt;STRONG&gt;set scale_to_zero_enabled=True&lt;/STRONG&gt;&lt;/FONT&gt;. The problem is: sometimes scaling up from zero works fine and sometimes it results in an error:&lt;/P&gt;&lt;PRE&gt;[b2rtc] File "/opt/conda/envs/mlflow-env/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
[b2rtc] raise self._exception
[b2rtc] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/mlflowserving/scoring_server/__init__.py", line 182, in get_model_option_or_exit
[b2rtc] self.model = self.model_future.result()
[b2rtc] File "/opt/conda/envs/mlflow-env/lib/python3.10/concurrent/futures/_base.py", line 451, in result
[b2rtc] return self.__get_result()
[b2rtc] File "/opt/conda/envs/mlflow-env/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
[b2rtc] raise self._exception
[b2rtc] File "/opt/conda/envs/mlflow-env/lib/python3.10/site-packages/mlflowserving/scoring_server/__init__.py", line 182, in get_model_option_or_exit
[b2rtc] self.model = self.model_future.result()
[b2rtc] File "/opt/conda/envs/mlflow-env/lib/python3.10/concurrent/futures/_base.py", line 451, in result
...
...&lt;/PRE&gt;&lt;P&gt;This goes on and on, but it's the same six lines over and over again. It's also interesting that in spite of the exception in the logs, the serving endpoint state never changes to &lt;FONT color="#808080"&gt;&lt;STRONG&gt;Error&lt;/STRONG&gt;&lt;/FONT&gt;, but remains &lt;FONT color="#808080"&gt;&lt;STRONG&gt;Ready (Scaling from zero)&lt;/STRONG&gt;&lt;/FONT&gt; instead.&lt;/P&gt;&lt;P&gt;My requirements are:&lt;/P&gt;&lt;PRE&gt;mlflow==2.14.1
cloudpickle==2.0.0
databricks-feature-engineering==0.2.1
databricks-sdk==0.12.0
databricks-vectorsearch==0.22
entrypoints==0.4
langchain-community==0.2.6
langchain==0.2.6
numpy==1.23.5
packaging==23.2
pandas==1.5.3
psutil==5.9.0
pydantic==1.10.6
pyyaml==6.0
requests==2.28.1
tornado==6.1&lt;/PRE&gt;</description>
      <pubDate>Wed, 24 Jul 2024 06:32:43 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/80259#M3508</guid>
      <dc:creator>damselfly20</dc:creator>
      <dc:date>2024-07-24T06:32:43Z</dc:date>
    </item>
    <item>
      <title>Re: Feature Store Model Serving endpoint</title>
      <link>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/81173#M3534</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/113489"&gt;@damselfly20&lt;/a&gt;, unfortunately I can't help much with that as I've never worked with RAGs. Are you sure it's the same error, though? &lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/101172"&gt;@NaeemS&lt;/a&gt;'s and my errors seem to be Java-related, while yours looks MLflow-related.&lt;/P&gt;</description>
      <pubDate>Tue, 30 Jul 2024 15:08:14 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/feature-store-model-serving-endpoint/m-p/81173#M3534</guid>
      <dc:creator>robbe</dc:creator>
      <dc:date>2024-07-30T15:08:14Z</dc:date>
    </item>
  </channel>
</rss>

