06-20-2025 02:07 AM
Hello,
I am trying to deploy a Gradio app (app.py) in Databricks, but I am having problems accessing data stored in a Unity Catalog volume. A path like "/Volumes/catalog/schema.../my_data" works fine when running in a workspace notebook with Unity Catalog enabled, but not from the app. I have also tried a "dbfs:/..." path and added the volume as an app resource, but neither works.
I would like to ask whether I am doing this correctly, or whether there is another way to reach the data (the files are model weights saved as .pt).
One method that comes to mind is using Delta Sharing to share the volume. However, it appears that Delta Sharing only supports tabular data.
Thanks.
- Labels:
  - Delta Sharing
  - Unity Catalog
06-22-2025 09:49 AM
Hi @kktim
You're facing a common challenge with Gradio apps on Databricks: the app runs in a different execution context than a notebook, so it does not have the same direct access to Unity Catalog volumes.
Here are several approaches to solve this:
Solution 1: Copy Files to DBFS During App Initialization
This is often the most reliable approach:
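A minimal sketch of that startup copy (the catalog/schema/volume names are hypothetical, and the sketch caches the weights on the app's local disk, which plays the role DBFS does for notebooks; the `databricks-sdk` package is assumed to be installed):

```python
import os

# Hypothetical paths -- adjust to your catalog/schema/volume layout.
VOLUME_DIR = "/Volumes/my_catalog/my_schema/my_volume/weights"
LOCAL_CACHE = "/tmp/model_weights"

def cached_path(volume_path: str, cache_dir: str = LOCAL_CACHE) -> str:
    """Local destination for a file stored in a UC volume."""
    return os.path.join(cache_dir, os.path.basename(volume_path))

def fetch_weights(volume_path: str, cache_dir: str = LOCAL_CACHE) -> str:
    """Download one file from a UC volume to local disk via the Files API."""
    from databricks.sdk import WorkspaceClient  # lazy: needs databricks-sdk

    os.makedirs(cache_dir, exist_ok=True)
    dest = cached_path(volume_path, cache_dir)
    if not os.path.exists(dest):  # download once, at app startup
        resp = WorkspaceClient().files.download(volume_path)
        with open(dest, "wb") as f:
            f.write(resp.contents.read())
    return dest

# At app initialization, before building/launching the Gradio UI:
# weights_file = fetch_weights(f"{VOLUME_DIR}/model.pt")
# model.load_state_dict(torch.load(weights_file))
```

Because the copy happens once at startup, later Gradio callbacks only ever touch the local file.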
Solution 2: Use Databricks File System API
Access files programmatically using the Databricks API:
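A sketch that calls the Files REST API (`/api/2.0/fs/files`) directly; the environment variable names and volume path are assumptions, and the request is streamed so large .pt files are not held in memory:

```python
import os

def files_api_url(host: str, volume_path: str) -> str:
    """Build the Files API download URL for a UC volume path."""
    return f"{host.rstrip('/')}/api/2.0/fs/files{volume_path}"

def download_via_files_api(volume_path: str, dest: str) -> str:
    """Stream one UC volume file to local disk over the REST API."""
    import requests  # lazy: only needed when actually downloading

    host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace-url>
    token = os.environ["DATABRICKS_TOKEN"]  # PAT or app OAuth token
    url = files_api_url(host, volume_path)
    with requests.get(url, headers={"Authorization": f"Bearer {token}"},
                      stream=True, timeout=60) as r:
        r.raise_for_status()
        with open(dest, "wb") as f:
            for chunk in r.iter_content(chunk_size=1 << 20):
                f.write(chunk)
    return dest

# download_via_files_api("/Volumes/my_catalog/my_schema/my_volume/model.pt",
#                        "/tmp/model.pt")
```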
Solution 3: Mount Volume to DBFS
If you have admin access, you can mount the volume:
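A heavily hedged sketch: note that a Unity Catalog volume itself cannot be passed to `dbutils.fs.mount`; what can be mounted, from a notebook by an admin with the right credentials, is the cloud storage location backing it (all storage names below are hypothetical, and `dbutils` exists only in the notebook context, not inside the app):

```python
MOUNT_POINT = "/mnt/model-weights"

def ensure_mounted(dbutils, source: str, mount_point: str = MOUNT_POINT) -> None:
    """Mount the storage container backing the volume, if not already mounted."""
    if not any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
        dbutils.fs.mount(source=source, mount_point=mount_point)

# In an admin notebook (hypothetical storage account/container):
# ensure_mounted(dbutils, "abfss://weights@myaccount.dfs.core.windows.net/")
# Then code can read e.g. "/dbfs/mnt/model-weights/model.pt"
```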
Solution 4: Environment Variables and Startup Script
Create a startup script that sets up the environment:
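One way this could look on the Python side, reading configuration from an environment variable that the app's deployment config exports (the variable name, fallback path, and the commented app.yaml fragment are assumptions, not an official schema):

```python
import os

# A Databricks App's app.yaml can export configuration as environment
# variables, e.g. (hypothetical):
#
#   env:
#     - name: "MODEL_WEIGHTS_PATH"
#       value: "/Volumes/my_catalog/my_schema/my_volume/weights/model.pt"

DEFAULT_WEIGHTS = "/tmp/model_weights/model.pt"

def weights_path() -> str:
    """Resolve the weights location from the environment, with a fallback."""
    return os.environ.get("MODEL_WEIGHTS_PATH", DEFAULT_WEIGHTS)
```

Keeping the path out of app.py means the same code runs unchanged in a notebook, locally, and in the deployed app.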
Solution 5: Use MLflow Model Registry
If these are ML models, consider using MLflow:
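A sketch of the app side, pulling a registered model's artifacts onto local disk with `mlflow.artifacts.download_artifacts` (the model name and version are hypothetical, and the `mlflow` package plus a reachable tracking server are assumed):

```python
def model_uri(name: str, version: int) -> str:
    """Registry URI understood by mlflow.artifacts.download_artifacts."""
    return f"models:/{name}/{version}"

def load_weights_from_registry(name: str, version: int,
                               dest_dir: str = "/tmp/mlflow") -> str:
    """Download a registered model's artifacts to the app's local disk."""
    import mlflow  # lazy: needs mlflow + a configured tracking server

    return mlflow.artifacts.download_artifacts(
        artifact_uri=model_uri(name, version), dst_path=dest_dir)

# local_dir = load_weights_from_registry("my_gradio_model", 3)
```

Registering the weights once also gives you versioning and lineage for free, instead of loose .pt files in a volume.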
The key insight is that Gradio apps run in a containerized environment that doesn't have the same direct access to Unity Catalog volumes as notebooks do.
By copying the files during app initialization, you ensure they're available in the app's execution context.
09-22-2025 09:04 AM
Unlike notebooks, Databricks Apps does not support mounting Unity Catalog volumes or reading and writing their files in place. Each file must first be downloaded to the app's compute before it can be used.
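For example, a startup loop that copies every file in a volume directory to the app's compute via the Databricks SDK might look like this (directory paths are hypothetical, and `databricks-sdk` is assumed to be installed):

```python
import os

def dest_for(entry_path: str, dest_dir: str) -> str:
    """Local destination for one volume entry."""
    return os.path.join(dest_dir, os.path.basename(entry_path))

def download_volume_dir(volume_dir: str, dest_dir: str) -> list:
    """Copy every file in a UC volume directory onto the app's local disk."""
    from databricks.sdk import WorkspaceClient  # lazy: needs databricks-sdk

    w = WorkspaceClient()
    os.makedirs(dest_dir, exist_ok=True)
    copied = []
    for entry in w.files.list_directory_contents(volume_dir):
        if entry.is_directory:
            continue  # sketch: skip subdirectories rather than recursing
        dest = dest_for(entry.path, dest_dir)
        with open(dest, "wb") as f:
            f.write(w.files.download(entry.path).contents.read())
        copied.append(dest)
    return copied

# download_volume_dir("/Volumes/my_catalog/my_schema/my_volume/weights",
#                     "/tmp/model_weights")
```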