10-29-2024 06:37 AM
Hi, I'm new to Databricks and wasn't able to find relevant info in the documentation.
Is it possible to serve a model to an online endpoint on Databricks with a custom scoring script to customise inference? The customisation concerns preprocessing incoming data and formatting the output, which doesn't seem to be part of the serving endpoint configuration: https://docs.databricks.com/api/azure/workspace/servingendpoints/updateconfig
Accepted Solutions
10-29-2024 10:00 AM
If I'm understanding correctly, all you really want is a pre/post-processing function running alongside your model, is that right? If so, you can do this with an MLflow pyfunc model, along the lines of what's shown here:
https://docs.databricks.com/en/machine-learning/model-serving/deploy-custom-models.html
Or in this notebook (possibly a better example): https://docs.databricks.com/en/_extras/notebooks/source/machine-learning/deploy-mlflow-pyfunc-model-...
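For reference, here's a minimal sketch of the pyfunc wrapper pattern those docs describe. The model URI, model name, and the specific pre/post-processing steps are placeholders you'd swap for your own; it assumes a scikit-learn flavoured base model, but any MLflow flavour works the same way:

```python
import mlflow
import pandas as pd


class WrappedModel(mlflow.pyfunc.PythonModel):
    def load_context(self, context):
        # Load the underlying model from the artifacts bundled with this pyfunc.
        # "base_model" is whatever key you use in the artifacts dict below.
        self.model = mlflow.sklearn.load_model(context.artifacts["base_model"])

    def _preprocess(self, model_input: pd.DataFrame) -> pd.DataFrame:
        # Example preprocessing: normalise incoming column names to what the
        # model was trained on. Put your own input handling here.
        return model_input.rename(columns=str.lower)

    def _postprocess(self, predictions) -> pd.DataFrame:
        # Example postprocessing: wrap raw predictions in a labelled DataFrame
        # so the endpoint returns a predictable JSON shape.
        return pd.DataFrame({"prediction": predictions})

    def predict(self, context, model_input):
        # Serving endpoint requests flow through this method.
        return self._postprocess(self.model.predict(self._preprocess(model_input)))


with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="wrapped_model",
        python_model=WrappedModel(),
        # Hypothetical path/URI to the already-trained base model artifacts.
        artifacts={"base_model": "path/to/base_model"},
        # Hypothetical registered name; this is what you point the serving
        # endpoint config at.
        registered_model_name="my_model_with_pre_post",
    )
```

You then serve the registered pyfunc model from the endpoint as usual, and the pre/post-processing runs inside `predict` on every request.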
Cheers.

