04-13-2023 04:41 AM
When I try to serve a model logged with FeatureStoreClient().log_model from the feature-store-online-example-cosmosdb tutorial notebook, I get errors suggesting that the primary key schema is not configured properly. However, when I look in the Feature Store UI, it lists
Primary Keys: wiine_id (LONG),
which seems correct. The experiment and model are also tracked correctly in MLflow; only the serving fails (both in Legacy Model Serving and when creating a serving endpoint). I also checked this tutorial on Pluralsight, and it doesn't look like I'm missing any steps in setting up the serving. I ran the tutorial notebook on a Databricks Runtime 12.2 LTS ML cluster.
Does anyone have an idea how I could resolve this? Thanks much!
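For context, the logging step in my notebook follows the tutorial roughly like this (a sketch from memory; the offline table name, the id/label source table, the label column, and the model details are simplified and partly assumed):

from databricks.feature_store import FeatureStoreClient, FeatureLookup
import mlflow.sklearn
from sklearn.ensemble import RandomForestRegressor

fs = FeatureStoreClient()

# Look up features from the feature table by its primary key
feature_lookups = [
    FeatureLookup(
        table_name="online_feature_store_example.wiine_features",  # assumed offline table name
        lookup_key="wiine_id",
    )
]

# DataFrame with the ids and label; "wine_ids_with_labels" is a hypothetical source table
inference_data_df = spark.table("online_feature_store_example.wine_ids_with_labels")

training_set = fs.create_training_set(
    df=inference_data_df,
    feature_lookups=feature_lookups,
    label="quality",
    exclude_columns=["wiine_id"],
)
training_df = training_set.load_df().toPandas()

model = RandomForestRegressor().fit(
    training_df.drop("quality", axis=1), training_df["quality"]
)

# Log the model together with the feature lookup metadata
fs.log_model(
    model,
    artifact_path="model",
    flavor=mlflow.sklearn,
    training_set=training_set,
    registered_model_name="wine_quality_model",  # assumed registered name
)

The serving logs below show the failure.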
+ echo 'GUNICORN_CMD_ARGS=--timeout 63 --workers 4 '
GUNICORN_CMD_ARGS=--timeout 63 --workers 4
+ mlflow models serve --no-conda -m /tmp/tmp67u15n1s/model -h unix:/tmp/1.sock -p1
2023/04/13 11:17:05 INFO mlflow.models.flavor_backend_registry: Selected backend for flavor 'python_function'
2023/04/13 11:17:05 INFO mlflow.pyfunc.backend: === Running command 'exec gunicorn --timeout=60 -b unix:/tmp/1.sock:1 -w 1 ${GUNICORN_CMD_ARGS} -- mlflow.pyfunc.scoring_server.wsgi:app'
[2023-04-13 11:17:06 +0000] [2419] [INFO] Starting gunicorn 20.1.0
[2023-04-13 11:17:06 +0000] [2419] [INFO] Listening at: unix:/tmp/1.sock:1 (2419)
[2023-04-13 11:17:06 +0000] [2419] [INFO] Using worker: sync
[2023-04-13 11:17:06 +0000] [2420] [INFO] Booting worker with pid: 2420
[2023-04-13 11:17:06 +0000] [2421] [INFO] Booting worker with pid: 2421
[2023-04-13 11:17:06 +0000] [2422] [INFO] Booting worker with pid: 2422
[2023-04-13 11:17:06 +0000] [2423] [INFO] Booting worker with pid: 2423
[2023-04-13 11:17:07 +0000] [2421] [ERROR] Exception in worker process
Traceback (most recent call last):
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/gunicorn/arbiter.py", line 589, in spawn_worker
worker.init_process()
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/gunicorn/workers/base.py", line 134, in init_process
self.load_wsgi()
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/gunicorn/workers/base.py", line 146, in load_wsgi
self.wsgi = self.app.wsgi()
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/gunicorn/app/base.py", line 67, in wsgi
self.callable = self.load()
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/gunicorn/app/wsgiapp.py", line 58, in load
return self.load_wsgiapp()
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/gunicorn/app/wsgiapp.py", line 48, in load_wsgiapp
return util.import_app(self.app_uri)
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/gunicorn/util.py", line 359, in import_app
mod = importlib.import_module(module)
File "/databricks/conda/envs/model-1/lib/python3.9/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 855, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/mlflow/pyfunc/scoring_server/wsgi.py", line 6, in <module>
app = scoring_server.init(load_model(os.environ[scoring_server._SERVER_MODEL_PATH]))
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/mlflow/pyfunc/__init__.py", line 582, in load_model
model_impl = importlib.import_module(conf[MAIN])._load_pyfunc(data_path)
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/databricks/feature_store/mlflow_model.py", line 634, in _load_pyfunc
return _FeatureStoreModelWrapper(path)
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/databricks/feature_store/mlflow_model.py", line 120, in __init__
self.ft_to_lookup_client = self._create_lookup_clients(self.ft_metadata)
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/databricks/feature_store/mlflow_model.py", line 189, in _create_lookup_clients
ft_to_lookup_client[ft] = OnlineLookupClient(
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/databricks/feature_store/online_lookup_client.py", line 174, in __init__
self.lookup_engine = self._generate_lookup_engine(
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/databricks/feature_store/online_lookup_client.py", line 189, in _generate_lookup_engine
return OnlineLookupClient._generate_lookup_engine_databricks(
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/databricks/feature_store/online_lookup_client.py", line 227, in _generate_lookup_engine_databricks
return generate_lookup_cosmosdb_engine(online_feature_table, creds)
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/databricks/feature_store/online_lookup_client.py", line 82, in generate_lookup_cosmosdb_engine
return LookupCosmosDbEngine(online_feature_table, authorization_key=creds)
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/databricks/feature_store/lookup_engine/lookup_cosmosdb_engine.py", line 68, in __init__
self._validate_online_feature_table()
File "/databricks/conda/envs/model-1/lib/python3.9/site-packages/databricks/feature_store/lookup_engine/lookup_cosmosdb_engine.py", line 80, in _validate_online_feature_table
raise ValueError(
ValueError: Online Table online_feature_store_example.feature_store_online_wiine_features primary key schema is not configured properly.
04-14-2023 12:13 AM
Hello @Thomas Michielsen, this error usually occurs when you have created the table yourself.
You must use publish_table() to create the table in the online store. Do not manually create a database or container inside Cosmos DB.
publish_table() does that for you automatically.
If you create a table without using publish_table(), the schema might be incompatible and the write command will fail.
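For reference, the example notebook publishes to Cosmos DB roughly like this (a sketch; the account URI, secret prefixes, and offline table name below are placeholders, not your exact values):

from databricks.feature_store import FeatureStoreClient
from databricks.feature_store.online_store_spec import AzureCosmosDBSpec

fs = FeatureStoreClient()

# Cosmos DB spec; publish_table() creates the database and container for you,
# so do not pre-create them with the Cosmos SDK or in the portal.
online_store_spec = AzureCosmosDBSpec(
    account_uri="https://<your-account>.documents.azure.com:443/",   # placeholder
    write_secret_prefix="feature-store-example-write/cosmos",        # placeholder secret scope/prefix
    read_secret_prefix="feature-store-example-read/cosmos",          # placeholder secret scope/prefix
    database_name="online_feature_store_example",
    container_name="feature_store_online_wiine_features",
)

# Publishes the offline feature table to Cosmos DB, creating the container
# with the primary-key schema that serving expects.
fs.publish_table(
    name="online_feature_store_example.wiine_features",  # assumed offline table name
    online_store=online_store_spec,
    mode="merge",
)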
I saw this in the notebook at this link - https://learn.microsoft.com/en-us/azure/databricks/_extras/notebooks/source/machine-learning/feature...
Can you please confirm that you are using publish_table()?
Thanks & Regards,
Nandini
04-14-2023 04:54 AM
Thanks @Nandini N. I had first manually created a database with the same name using azure.cosmos.CosmosClient. Because I hadn't deleted that database, something went wrong when I later ran publish_table() in the Databricks example notebook. After I manually deleted that database with azure.cosmos.CosmosClient.delete_database and reran the notebook, the error went away.
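In case it helps anyone else, the cleanup looked roughly like this (a sketch; the endpoint, key, and database name are placeholders):

from azure.cosmos import CosmosClient

# Connect with the Cosmos DB account endpoint and key (placeholders)
client = CosmosClient(
    url="https://<your-account>.documents.azure.com:443/",
    credential="<your-account-key>",
)

# Drop the manually created database so publish_table() can recreate it
# with the schema the Feature Store expects.
client.delete_database("online_feature_store_example")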
04-14-2023 06:20 AM
Thank you for sharing, @Thomas Michielsen. I am glad I could help. Kudos!