<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Accessing Databricks Volumes from a Serving Endpoint Using a Custom Model Class in Unity Catalog in Machine Learning</title>
    <link>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/137453#M4404</link>
    <description>&lt;P&gt;Greetings&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/117422"&gt;@VELU1122&lt;/a&gt;&amp;nbsp;,&amp;nbsp; you’re correct that the Databricks Model Serving container is isolated, so you can’t rely on cluster-only affordances like mounts or executor-distributed file utilities. The reliable way to read from &lt;STRONG&gt;Unity Catalog (UC) Volumes&lt;/STRONG&gt; in a serving endpoint is to use the &lt;STRONG&gt;Databricks Files API / SDK&lt;/STRONG&gt; with an endpoint-injected credential, and address files by their UC Volumes path, for example &lt;CODE&gt;/Volumes/&amp;lt;catalog&amp;gt;/&amp;lt;schema&amp;gt;/&amp;lt;volume&amp;gt;/&amp;lt;relative_path&amp;gt;&lt;/CODE&gt;.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3 class="paragraph"&gt;What works from Model Serving&lt;/H3&gt;
&lt;UL&gt;
&lt;LI class="paragraph"&gt;Use the &lt;STRONG&gt;Files REST API&lt;/STRONG&gt; or the &lt;STRONG&gt;Databricks SDK (WorkspaceClient.files)&lt;/STRONG&gt; to list, download, and upload files in UC Volumes with paths like &lt;CODE&gt;/Volumes/&amp;lt;catalog&amp;gt;/&amp;lt;schema&amp;gt;/&amp;lt;volume&amp;gt;/...&lt;/CODE&gt;. This is supported for managing and reading files directly from Volumes, and avoids the need for dbutils or mounts inside the serving container.&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Inject credentials into the serving container using &lt;STRONG&gt;environment variables backed by Databricks secrets&lt;/STRONG&gt;. Define &lt;CODE&gt;DATABRICKS_HOST&lt;/CODE&gt; as plain text and &lt;CODE&gt;DATABRICKS_TOKEN&lt;/CODE&gt; (or use OAuth for a service principal) as a secret in the endpoint config; then use the SDK to call the Files API at inference time.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Ensure the endpoint’s identity (the user or service principal that created the endpoint) has UC privileges (for example, &lt;STRONG&gt;READ FILES&lt;/STRONG&gt; on the Volume). Endpoint identity is fixed at creation and is used for UC access checks; if it lacks privileges, recreate the endpoint under an identity that has access.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;If your Volume is external, you can also access data via &lt;STRONG&gt;cloud URIs&lt;/STRONG&gt; (s3://, abfss://, gs://) as part of Volumes GA, but you still must provide cloud credentials in the serving container (for example via an instance profile on the endpoint or provider-specific auth). For many scenarios, the Files API / SDK is simpler and keeps governance in UC.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H3 class="paragraph"&gt;What doesn’t work in Model Serving&lt;/H3&gt;
&lt;UL&gt;
&lt;LI class="paragraph"&gt;Avoid &lt;CODE&gt;dbutils.fs.mount&lt;/CODE&gt; or relying on FUSE-style local paths in serving containers; use Files API / SDK instead. Model Serving doesn’t run notebook executors and doesn’t support the same dbutils semantics; Volumes are intended for path-based governance and programmatic access via APIs and POSIX-like paths, not runtime mounts in serving.&lt;/LI&gt;
&lt;/UL&gt;
&lt;H3 class="paragraph"&gt;Recommended pattern 1. Configure environment variables with secrets on your endpoint:&lt;/H3&gt;
&lt;UL&gt;
&lt;LI class="paragraph"&gt;In Serving UI or via REST/SDK, add: * &lt;CODE&gt;DATABRICKS_HOST&lt;/CODE&gt;: &lt;CODE&gt;https://&amp;lt;your-workspace-url&amp;gt;&lt;/CODE&gt; (plain text). * &lt;CODE&gt;DATABRICKS_TOKEN&lt;/CODE&gt;: &lt;CODE&gt;{{secrets/&amp;lt;scope&amp;gt;/&amp;lt;key&amp;gt;}}&lt;/CODE&gt; (secret). * Alternatively, use OAuth M2M for a service principal and inject &lt;CODE&gt;DATABRICKS_CLIENT_ID&lt;/CODE&gt; / &lt;CODE&gt;DATABRICKS_CLIENT_SECRET&lt;/CODE&gt; and fetch short-lived tokens at runtime, then call the Files API. This avoids PATs and is recommended for unattended endpoints.&lt;/LI&gt;
&lt;/UL&gt;
&lt;OL start="2"&gt;
&lt;LI&gt;From your custom &lt;CODE&gt;python_model&lt;/CODE&gt; class, read files with the SDK: ```python import os import io from databricks.sdk import WorkspaceClient&lt;/LI&gt;
&lt;/OL&gt;
&lt;DIV class="paragraph"&gt;class MicrosoftResnet50Model(mlflow.pyfunc.PythonModel): def load_context(self, context): host = os.environ["DATABRICKS_HOST"] token = os.environ["DATABRICKS_TOKEN"] # or build OAuth client and fetch an access token self.w = WorkspaceClient(host=host, token=token)&lt;/DIV&gt;
&lt;PRE&gt;&lt;CODE&gt;def _read_volume_file(self, path: str) -&amp;gt; bytes:
    # path like "/Volumes/&amp;lt;catalog&amp;gt;/&amp;lt;schema&amp;gt;/&amp;lt;volume&amp;gt;/images/cat.jpg"
    resp = self.w.files.download(path)  # returns a response with .contents (bytes)
    return resp.contents&lt;/CODE&gt;&lt;/PRE&gt;
&lt;DIV class="paragraph"&gt;def predict(self, context, model_input): # Example: model_input contains file names relative to your volume catalog, schema, volume = context.artifacts.get("uc_volume_ns", ("main", "default", "my_volume")) rel_path = model_input.get("relative_path") # e.g., "images/cat.jpg" volume_path = f"/Volumes/{catalog}/{schema}/{volume}/{rel_path}" # must include the volume name img_bytes = self._read_volume_file(volume_path) # ... open bytes with PIL, transform, run inference, return outputs ... # return predictions ```&lt;/DIV&gt;
&lt;OL start="3"&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Pass any constant namespace values or paths you need as &lt;STRONG&gt;artifacts/params&lt;/STRONG&gt; when logging the model or as &lt;STRONG&gt;endpoint environment variables&lt;/STRONG&gt;, so your class can construct the &lt;CODE&gt;/Volumes/...&lt;/CODE&gt; path at runtime.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;If you truly need direct cloud access (for external Volumes), configure the endpoint with an &lt;STRONG&gt;instance profile&lt;/STRONG&gt; or provider credentials and use the cloud SDK/URI. Otherwise, prefer the Files API route for simplicity and governance consistency.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;H3 class="paragraph"&gt;Why the errors occur&lt;/H3&gt;
&lt;UL&gt;
&lt;LI class="paragraph"&gt;“No such file or directory” happens when using local filesystem paths that aren’t available in the serving container; UC Volume access in Serving should go through the Files API/SDK and Volume paths, not mounts.&lt;/LI&gt;
&lt;LI&gt;&lt;CODE&gt;dbutils&lt;/CODE&gt; is notebook/cluster-bound; Model Serving supports environment variables and secrets injection for external access, not dbutils mounts. Use the Files API / SDK instead of dbutils in serving.&lt;/LI&gt;
&lt;/UL&gt;
&lt;H3 class="paragraph"&gt;Alternative strategies&lt;/H3&gt;
&lt;UL&gt;
&lt;LI class="paragraph"&gt;If the files are static assets required for inference (labels, templates, small configs), &lt;STRONG&gt;bundle them as MLflow model artifacts&lt;/STRONG&gt; at log time and access them via &lt;CODE&gt;context.artifacts&lt;/CODE&gt; rather than reaching out to Volumes during inference. This reduces I/O and removes external dependencies at serving time.&lt;/LI&gt;
&lt;LI&gt;For high-throughput batch scenarios that require broad data scans, consider &lt;STRONG&gt;Jobs on UC-enabled compute&lt;/STRONG&gt; reading from Volumes with Spark, and write outputs to tables; use Model Serving for low-latency point queries. Volumes are fully supported across Spark, SQL, dbutils, REST, CLI, and SDKs, so you can mix patterns as needed.&lt;/LI&gt;
&lt;/UL&gt;
&lt;H3 class="paragraph"&gt;Endpoint config snippets&lt;/H3&gt;
&lt;DIV class="paragraph"&gt;Create or update endpoint with secret-based env vars: &lt;CODE&gt;json
{
  "name": "uc-model-endpoint",
  "config": {
    "served_entities": [
      {
        "entity_name": "myCatalog.mySchema.myModel",
        "entity_version": "1",
        "workload_size": "Small",
        "scale_to_zero_enabled": true,
        "environment_vars": {
          "DATABRICKS_HOST": "https://&amp;lt;workspace-url&amp;gt;",
          "DATABRICKS_TOKEN": "{{secrets/my_scope/my_token_key}}"
        }
      }
    ]
  }
}
&lt;/CODE&gt;&lt;/DIV&gt;
&lt;DIV class="paragraph"&gt;Key references if you want to dig deeper: * Files API and SDK examples for Volumes, including REST paths: &lt;CODE&gt;/api/2.0/fs/files/Volumes/...&lt;/CODE&gt; and SDK usage in &lt;CODE&gt;WorkspaceClient.files&lt;/CODE&gt;.&lt;/DIV&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Volumes GA capabilities and cloud URI access for external Volumes.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Volumes object model, path rules, and limitations (must include the volume name in paths, intended for path-based access).&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Serving endpoint identity and UC access implications.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;DIV class="paragraph"&gt;Hope this helps, Louis.&lt;/DIV&gt;</description>
    <pubDate>Mon, 03 Nov 2025 20:55:45 GMT</pubDate>
    <dc:creator>Louis_Frolio</dc:creator>
    <dc:date>2025-11-03T20:55:45Z</dc:date>
    <item>
      <title>Accessing Databricks Volumes from a Serving Endpoint Using a Custom Model Class in Unity Catalog</title>
      <link>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/93196#M3716</link>
      <description>&lt;P&gt;Hi everyone,&lt;/P&gt;&lt;P&gt;I’m looking for accessing Unity Catalog (UC) Volumes from a Databricks Serving Endpoint. Here’s my current setup:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;I have a custom AI model class for inference, which I logged into Unity Catalog using mlflow.pyfunc.log_model.&lt;/LI&gt;&lt;LI&gt;I’ve created a Serving Endpoint for this model.&lt;/LI&gt;&lt;/UL&gt;&lt;H3&gt;Challenges:&lt;/H3&gt;&lt;OL&gt;&lt;LI&gt;When trying to access UC Volumes directly from my custom class during inference, I get a "No such file or directory" error.&lt;/LI&gt;&lt;LI&gt;I attempted to mount the UC Volumes within the custom class using dbutils.fs.mount, but when logging the model (mlflow.pyfunc.log_model), I encountered an error that dbutils can’t be used in the Spark environment.&lt;/LI&gt;&lt;/OL&gt;&lt;H3&gt;Question:&lt;/H3&gt;&lt;P&gt;Since the Serving Endpoint runs in an isolated environment, how can I access Unity Catalog Volumes from within my custom model class during inference?&lt;/P&gt;&lt;P&gt;Any guidance on solving this issue or alternative methods to access UC Volumes from a Serving Endpoint would be greatly appreciated.&lt;/P&gt;&lt;P&gt;Thanks in advance&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 08 Oct 2024 22:39:15 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/93196#M3716</guid>
      <dc:creator>VELU1122</dc:creator>
      <dc:date>2024-10-08T22:39:15Z</dc:date>
    </item>
    <item>
      <title>Re: Accessing Databricks Volumes from a Serving Endpoint Using a Custom Model Class in Unity Catalog</title>
      <link>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/93197#M3717</link>
      <description>&lt;P&gt;Additionally, I log the model as shown below, with MicrosoftResnet50Model being my custom inference class with load_context and predict methods:&lt;BR /&gt;with mlflow.start_run():&lt;BR /&gt;model_info = mlflow.pyfunc.log_model(&lt;BR /&gt;REGISTERED_MODEL_NAME,&lt;BR /&gt;python_model=MicrosoftResnet50Model(),&lt;BR /&gt;input_example=api_input_example,&lt;BR /&gt;artifacts={"model_path": MODEL_PATH},&lt;BR /&gt;pip_requirements=[&lt;BR /&gt;f"transformers=={transformers.__version__}",&lt;BR /&gt;"torch==2.0.1"&lt;BR /&gt;],&lt;BR /&gt;signature=signature,&lt;BR /&gt;registered_model_name=f"{CATALOG}.{SCHEMA}.{REGISTERED_MODEL_NAME}"&lt;BR /&gt;)&lt;/P&gt;</description>
      <pubDate>Tue, 08 Oct 2024 22:39:44 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/93197#M3717</guid>
      <dc:creator>VELU1122</dc:creator>
      <dc:date>2024-10-08T22:39:44Z</dc:date>
    </item>
    <item>
      <title>Re: Accessing Databricks Volumes from a Serving Endpoint Using a Custom Model Class in Unity Catalog</title>
      <link>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/116277#M4038</link>
      <description>&lt;P&gt;Hey&amp;nbsp;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN class=""&gt;&lt;A class="" href="https://community.databricks.com/t5/user/viewprofilepage/user-id/117422" target="_self"&gt;&lt;SPAN class=""&gt;VELU1122,&lt;/SPAN&gt;&lt;/A&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;did you find a solution for it. We are struggling with the same problem currently.&amp;nbsp;&lt;/P&gt;&lt;P&gt;Thanks&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 23 Apr 2025 06:36:18 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/116277#M4038</guid>
      <dc:creator>Lloetters</dc:creator>
      <dc:date>2025-04-23T06:36:18Z</dc:date>
    </item>
    <item>
      <title>Re: Accessing Databricks Volumes from a Serving Endpoint Using a Custom Model Class in Unity Catalog</title>
      <link>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/137453#M4404</link>
      <description>&lt;P&gt;Greetings&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/117422"&gt;@VELU1122&lt;/a&gt;&amp;nbsp;,&amp;nbsp; you’re correct that the Databricks Model Serving container is isolated, so you can’t rely on cluster-only affordances like mounts or executor-distributed file utilities. The reliable way to read from &lt;STRONG&gt;Unity Catalog (UC) Volumes&lt;/STRONG&gt; in a serving endpoint is to use the &lt;STRONG&gt;Databricks Files API / SDK&lt;/STRONG&gt; with an endpoint-injected credential, and address files by their UC Volumes path, for example &lt;CODE&gt;/Volumes/&amp;lt;catalog&amp;gt;/&amp;lt;schema&amp;gt;/&amp;lt;volume&amp;gt;/&amp;lt;relative_path&amp;gt;&lt;/CODE&gt;.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3 class="paragraph"&gt;What works from Model Serving&lt;/H3&gt;
&lt;UL&gt;
&lt;LI class="paragraph"&gt;Use the &lt;STRONG&gt;Files REST API&lt;/STRONG&gt; or the &lt;STRONG&gt;Databricks SDK (WorkspaceClient.files)&lt;/STRONG&gt; to list, download, and upload files in UC Volumes with paths like &lt;CODE&gt;/Volumes/&amp;lt;catalog&amp;gt;/&amp;lt;schema&amp;gt;/&amp;lt;volume&amp;gt;/...&lt;/CODE&gt;. This is supported for managing and reading files directly from Volumes, and avoids the need for dbutils or mounts inside the serving container.&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Inject credentials into the serving container using &lt;STRONG&gt;environment variables backed by Databricks secrets&lt;/STRONG&gt;. Define &lt;CODE&gt;DATABRICKS_HOST&lt;/CODE&gt; as plain text and &lt;CODE&gt;DATABRICKS_TOKEN&lt;/CODE&gt; (or use OAuth for a service principal) as a secret in the endpoint config; then use the SDK to call the Files API at inference time.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Ensure the endpoint’s identity (the user or service principal that created the endpoint) has UC privileges (for example, &lt;STRONG&gt;READ FILES&lt;/STRONG&gt; on the Volume). Endpoint identity is fixed at creation and is used for UC access checks; if it lacks privileges, recreate the endpoint under an identity that has access.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;If your Volume is external, you can also access data via &lt;STRONG&gt;cloud URIs&lt;/STRONG&gt; (s3://, abfss://, gs://) as part of Volumes GA, but you still must provide cloud credentials in the serving container (for example via an instance profile on the endpoint or provider-specific auth). For many scenarios, the Files API / SDK is simpler and keeps governance in UC.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;H3 class="paragraph"&gt;What doesn’t work in Model Serving&lt;/H3&gt;
&lt;UL&gt;
&lt;LI class="paragraph"&gt;Avoid &lt;CODE&gt;dbutils.fs.mount&lt;/CODE&gt; or relying on FUSE-style local paths in serving containers; use Files API / SDK instead. Model Serving doesn’t run notebook executors and doesn’t support the same dbutils semantics; Volumes are intended for path-based governance and programmatic access via APIs and POSIX-like paths, not runtime mounts in serving.&lt;/LI&gt;
&lt;/UL&gt;
&lt;H3 class="paragraph"&gt;Recommended pattern 1. Configure environment variables with secrets on your endpoint:&lt;/H3&gt;
&lt;UL&gt;
&lt;LI class="paragraph"&gt;In Serving UI or via REST/SDK, add: * &lt;CODE&gt;DATABRICKS_HOST&lt;/CODE&gt;: &lt;CODE&gt;https://&amp;lt;your-workspace-url&amp;gt;&lt;/CODE&gt; (plain text). * &lt;CODE&gt;DATABRICKS_TOKEN&lt;/CODE&gt;: &lt;CODE&gt;{{secrets/&amp;lt;scope&amp;gt;/&amp;lt;key&amp;gt;}}&lt;/CODE&gt; (secret). * Alternatively, use OAuth M2M for a service principal and inject &lt;CODE&gt;DATABRICKS_CLIENT_ID&lt;/CODE&gt; / &lt;CODE&gt;DATABRICKS_CLIENT_SECRET&lt;/CODE&gt; and fetch short-lived tokens at runtime, then call the Files API. This avoids PATs and is recommended for unattended endpoints.&lt;/LI&gt;
&lt;/UL&gt;
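As a sketch of how the serving code might pick between these two credential styles, the helper below (hypothetical, not part of the Databricks SDK) assembles keyword arguments for WorkspaceClient, preferring the OAuth M2M variables over a PAT when both are present:

```python
def workspace_client_kwargs(env: dict) -> dict:
    """Choose auth kwargs for databricks.sdk.WorkspaceClient from endpoint
    env vars. Hypothetical helper; the variable names match the ones
    discussed above, and the SDK performs the OAuth M2M flow itself when
    given client_id/client_secret."""
    kwargs = {"host": env["DATABRICKS_HOST"]}
    client_id = env.get("DATABRICKS_CLIENT_ID")
    client_secret = env.get("DATABRICKS_CLIENT_SECRET")
    if client_id and client_secret:
        # prefer OAuth M2M for unattended endpoints
        kwargs.update(client_id=client_id, client_secret=client_secret)
    else:
        # fall back to the PAT injected from a secret
        kwargs["token"] = env["DATABRICKS_TOKEN"]
    return kwargs
```

In load_context you would then call WorkspaceClient(**workspace_client_kwargs(os.environ)), so the same model code works with either credential style.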
&lt;OL start="2"&gt;
&lt;LI&gt;From your custom &lt;CODE&gt;python_model&lt;/CODE&gt; class, read files with the SDK: ```python import os import io from databricks.sdk import WorkspaceClient&lt;/LI&gt;
&lt;/OL&gt;
&lt;DIV class="paragraph"&gt;class MicrosoftResnet50Model(mlflow.pyfunc.PythonModel): def load_context(self, context): host = os.environ["DATABRICKS_HOST"] token = os.environ["DATABRICKS_TOKEN"] # or build OAuth client and fetch an access token self.w = WorkspaceClient(host=host, token=token)&lt;/DIV&gt;
&lt;PRE&gt;&lt;CODE&gt;def _read_volume_file(self, path: str) -&amp;gt; bytes:
    # path like "/Volumes/&amp;lt;catalog&amp;gt;/&amp;lt;schema&amp;gt;/&amp;lt;volume&amp;gt;/images/cat.jpg"
    resp = self.w.files.download(path)  # returns a response with .contents (bytes)
    return resp.contents&lt;/CODE&gt;&lt;/PRE&gt;
&lt;DIV class="paragraph"&gt;def predict(self, context, model_input): # Example: model_input contains file names relative to your volume catalog, schema, volume = context.artifacts.get("uc_volume_ns", ("main", "default", "my_volume")) rel_path = model_input.get("relative_path") # e.g., "images/cat.jpg" volume_path = f"/Volumes/{catalog}/{schema}/{volume}/{rel_path}" # must include the volume name img_bytes = self._read_volume_file(volume_path) # ... open bytes with PIL, transform, run inference, return outputs ... # return predictions ```&lt;/DIV&gt;
&lt;OL start="3"&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Pass any constant namespace values or paths you need as &lt;STRONG&gt;artifacts/params&lt;/STRONG&gt; when logging the model or as &lt;STRONG&gt;endpoint environment variables&lt;/STRONG&gt;, so your class can construct the &lt;CODE&gt;/Volumes/...&lt;/CODE&gt; path at runtime.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;If you truly need direct cloud access (for external Volumes), configure the endpoint with an &lt;STRONG&gt;instance profile&lt;/STRONG&gt; or provider credentials and use the cloud SDK/URI. Otherwise, prefer the Files API route for simplicity and governance consistency.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;/OL&gt;
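Because a path that omits the volume name is a common cause of "file not found" errors, it can help to centralize path construction in one place. The helper below is an illustrative utility (not a Databricks API):

```python
def uc_volume_path(catalog: str, schema: str, volume: str, rel_path: str) -> str:
    """Build a UC Volumes file path; catalog, schema, and the volume
    name are all mandatory components of the path."""
    for part in (catalog, schema, volume):
        if not part:
            raise ValueError("catalog, schema, and volume are all required")
    # normalize the relative part so we never emit a double slash
    return "/Volumes/{}/{}/{}/{}".format(catalog, schema, volume, rel_path.lstrip("/"))
```

Your predict method can then pass the result straight to the Files API download call.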
&lt;H3 class="paragraph"&gt;Why the errors occur&lt;/H3&gt;
&lt;UL&gt;
&lt;LI class="paragraph"&gt;“No such file or directory” happens when using local filesystem paths that aren’t available in the serving container; UC Volume access in Serving should go through the Files API/SDK and Volume paths, not mounts.&lt;/LI&gt;
&lt;LI&gt;&lt;CODE&gt;dbutils&lt;/CODE&gt; is notebook/cluster-bound; Model Serving supports environment variables and secrets injection for external access, not dbutils mounts. Use the Files API / SDK instead of dbutils in serving.&lt;/LI&gt;
&lt;/UL&gt;
&lt;H3 class="paragraph"&gt;Alternative strategies&lt;/H3&gt;
&lt;UL&gt;
&lt;LI class="paragraph"&gt;If the files are static assets required for inference (labels, templates, small configs), &lt;STRONG&gt;bundle them as MLflow model artifacts&lt;/STRONG&gt; at log time and access them via &lt;CODE&gt;context.artifacts&lt;/CODE&gt; rather than reaching out to Volumes during inference. This reduces I/O and removes external dependencies at serving time.&lt;/LI&gt;
&lt;LI&gt;For high-throughput batch scenarios that require broad data scans, consider &lt;STRONG&gt;Jobs on UC-enabled compute&lt;/STRONG&gt; reading from Volumes with Spark, and write outputs to tables; use Model Serving for low-latency point queries. Volumes are fully supported across Spark, SQL, dbutils, REST, CLI, and SDKs, so you can mix patterns as needed.&lt;/LI&gt;
&lt;/UL&gt;
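To illustrate the artifact-bundling route: at serving time, context.artifacts maps each artifact name to a local file path inside the container. The sketch below simulates that mapping with a temp directory so it runs without MLflow; the artifact name "labels" is made up for the example:

```python
import json
import pathlib
import tempfile

def load_labels(artifacts: dict) -> list:
    # Mirrors what load_context would do: read a file that was bundled
    # at log time, via the local path that context.artifacts provides.
    labels_path = pathlib.Path(artifacts["labels"])
    return json.loads(labels_path.read_text())

# Simulate the container-local artifact layout
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "labels.json").write_text(json.dumps(["cat", "dog"]))
artifacts = {"labels": str(tmp / "labels.json")}
print(load_labels(artifacts))  # prints ['cat', 'dog']
```

The real wiring is just artifacts={"labels": "/path/to/labels.json"} at log_model time, after which serving needs no network call to fetch the file.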
&lt;H3 class="paragraph"&gt;Endpoint config snippets&lt;/H3&gt;
&lt;DIV class="paragraph"&gt;Create or update endpoint with secret-based env vars: &lt;CODE&gt;json
{
  "name": "uc-model-endpoint",
  "config": {
    "served_entities": [
      {
        "entity_name": "myCatalog.mySchema.myModel",
        "entity_version": "1",
        "workload_size": "Small",
        "scale_to_zero_enabled": true,
        "environment_vars": {
          "DATABRICKS_HOST": "https://&amp;lt;workspace-url&amp;gt;",
          "DATABRICKS_TOKEN": "{{secrets/my_scope/my_token_key}}"
        }
      }
    ]
  }
}
&lt;/CODE&gt;&lt;/DIV&gt;
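The same payload can be assembled programmatically before posting it to the Serving Endpoints REST API. The function below is an illustrative helper: the field names follow the JSON above, and everything passed in is a placeholder for your workspace:

```python
def serving_endpoint_payload(entity_name: str, version: str,
                             scope: str, key: str, host: str) -> dict:
    # Mirrors the JSON config above; the token value is rendered in the
    # {{secrets/scope/key}} form that Model Serving resolves at deploy time.
    return {
        "name": "uc-model-endpoint",
        "config": {
            "served_entities": [
                {
                    "entity_name": entity_name,
                    "entity_version": version,
                    "workload_size": "Small",
                    "scale_to_zero_enabled": True,
                    "environment_vars": {
                        "DATABRICKS_HOST": host,
                        "DATABRICKS_TOKEN": "{{secrets/" + scope + "/" + key + "}}",
                    },
                }
            ]
        },
    }
```

Keeping the payload as a plain dict makes it easy to version-control and to post with either the SDK or a raw HTTP client.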
&lt;DIV class="paragraph"&gt;Key references if you want to dig deeper: * Files API and SDK examples for Volumes, including REST paths: &lt;CODE&gt;/api/2.0/fs/files/Volumes/...&lt;/CODE&gt; and SDK usage in &lt;CODE&gt;WorkspaceClient.files&lt;/CODE&gt;.&lt;/DIV&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Volumes GA capabilities and cloud URI access for external Volumes.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Volumes object model, path rules, and limitations (must include the volume name in paths, intended for path-based access).&lt;/DIV&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;DIV class="paragraph"&gt;Serving endpoint identity and UC access implications.&lt;/DIV&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;DIV class="paragraph"&gt;Hope this helps, Louis.&lt;/DIV&gt;</description>
      <pubDate>Mon, 03 Nov 2025 20:55:45 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/137453#M4404</guid>
      <dc:creator>Louis_Frolio</dc:creator>
      <dc:date>2025-11-03T20:55:45Z</dc:date>
    </item>
    <item>
      <title>Re: Accessing Databricks Volumes from a Serving Endpoint Using a Custom Model Class in Unity Catalog</title>
      <link>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/140776#M4454</link>
      <description>&lt;DIV id="bodyDisplay_1" class="lia-message-body lia-component-message-view-widget-body lia-component-body-signature-highlight-escalation lia-component-message-view-widget-body-signature-highlight-escalation"&gt;
&lt;DIV class="lia-message-body-content"&gt;
&lt;P&gt;Serverless Model Serving does not mount the UC Volumes FUSE path (/Volumes), so references to “/Volumes/…” inside a custom pyfunc’s model code will fail at container build or runtime. The correct pattern is to package any required files (like your GGUF) into the model artifact at log time and then load them from context.artifacts[...] in load_context() during serving.&lt;/P&gt;
&lt;P&gt;Ref Doc -&amp;nbsp;&lt;A href="https://docs.databricks.com/aws/en/machine-learning/model-serving/model-serving-custom-artifacts" target="_blank" rel="nofollow noopener noreferrer"&gt;https://docs.databricks.com/aws/en/machine-learning/model-serving/model-serving-custom-artifacts&lt;/A&gt;&lt;/P&gt;
&lt;/DIV&gt;
&lt;/DIV&gt;</description>
      <pubDate>Mon, 01 Dec 2025 19:03:06 GMT</pubDate>
      <guid>https://community.databricks.com/t5/machine-learning/accessing-databricks-volumes-from-a-serving-endpoint-using-a/m-p/140776#M4454</guid>
      <dc:creator>iyashk-DB</dc:creator>
      <dc:date>2025-12-01T19:03:06Z</dc:date>
    </item>
  </channel>
</rss>

