Machine Learning
Dive into the world of machine learning on the Databricks platform. Explore discussions on algorithms, model training, deployment, and more. Connect with ML enthusiasts and experts.

Accessing Databricks Volumes from a Serving Endpoint Using a Custom Model Class in Unity Catalog

VELU1122
New Contributor II

Hi everyone,

I’m trying to access Unity Catalog (UC) Volumes from a Databricks Model Serving endpoint. Here’s my current setup:

  • I have a custom AI model class for inference, which I logged into Unity Catalog using mlflow.pyfunc.log_model.
  • I’ve created a Serving Endpoint for this model.

Challenges:

  1. When my custom class tries to read from a UC Volume path directly during inference, I get a "No such file or directory" error.
  2. I then attempted to mount the UC Volume inside the custom class using dbutils.fs.mount, but when logging the model (mlflow.pyfunc.log_model) I got an error saying dbutils can’t be used in that environment.

Question:

Since the Serving Endpoint runs in an isolated environment, how can I access Unity Catalog Volumes from within my custom model class during inference?

Any guidance on solving this issue or alternative methods to access UC Volumes from a Serving Endpoint would be greatly appreciated.
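One pattern that may help (a sketch, not an official recipe): the serving container has no local /Volumes filesystem, but it can reach the workspace over REST. Assuming the endpoint is configured with DATABRICKS_HOST and DATABRICKS_TOKEN environment variables, the model can download the volume file through the databricks-sdk Files API in load_context. The volume path and class name below are hypothetical:

```python
class VolumeAwareModel:
    """Sketch of a pyfunc-style model (it would subclass
    mlflow.pyfunc.PythonModel) that fetches a UC Volume file over the
    Files API at load time instead of reading a local path."""

    # Hypothetical volume path -- replace with your own catalog/schema/volume.
    VOLUME_FILE = "/Volumes/my_catalog/my_schema/my_volume/weights.bin"

    def load_context(self, context, client=None):
        # The serving container has no /Volumes mount, so open() fails there;
        # download over REST instead. WorkspaceClient() picks up
        # DATABRICKS_HOST / DATABRICKS_TOKEN from the endpoint's environment.
        if client is None:
            from databricks.sdk import WorkspaceClient
            client = WorkspaceClient()
        resp = client.files.download(self.VOLUME_FILE)
        self.weights = resp.contents.read()  # raw bytes of the volume file

    def predict(self, context, model_input):
        # Real inference would use self.weights; omitted in this sketch.
        raise NotImplementedError
```

The client parameter is only there so the download step can be exercised without a live workspace; MLflow itself will call load_context(context) and the default WorkspaceClient will be built from the endpoint's environment.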

Thanks in advance


1 REPLY

VELU1122
New Contributor II

Additionally, I log the model as shown below, with MicrosoftResnet50Model being my custom inference class with load_context and predict methods:
with mlflow.start_run():
    model_info = mlflow.pyfunc.log_model(
        REGISTERED_MODEL_NAME,
        python_model=MicrosoftResnet50Model(),
        input_example=api_input_example,
        artifacts={"model_path": MODEL_PATH},
        pip_requirements=[
            f"transformers=={transformers.__version__}",
            "torch==2.0.1",
        ],
        signature=signature,
        registered_model_name=f"{CATALOG}.{SCHEMA}.{REGISTERED_MODEL_NAME}",
    )
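For files that are already known at logging time, an alternative worth noting (a sketch under assumptions, not the author's confirmed setup) is to lean on the artifacts mechanism used in the call above: anything passed via artifacts= is copied into the model package, and the serving container exposes a local copy through context.artifacts, so no volume access is needed at inference time. The class name below is hypothetical:

```python
# Sketch: reading a file that was packaged via artifacts={"model_path": ...}
# at log_model time. MLflow downloads logged artifacts into the serving
# container and hands back a *local* path through context.artifacts, so the
# file can be opened normally; no UC Volume access is required at inference.
class ArtifactBackedModel:  # would subclass mlflow.pyfunc.PythonModel
    def load_context(self, context):
        local_path = context.artifacts["model_path"]  # local path in the container
        with open(local_path, "rb") as f:
            self._model_bytes = f.read()  # e.g. feed to torch.load()

    def predict(self, context, model_input):
        # Real inference would use the loaded weights; omitted in this sketch.
        raise NotImplementedError
```

This works when the file is static per model version; if the volume contents must change without re-logging the model, the REST download approach is the better fit.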
