How to Pass Data to a Databricks App?
12-10-2024 01:47 PM
I am developing a Databricks application using the Streamlit package. I was able to get a "hello world" app deployed successfully, but now I am trying to pass in data that exists in DBFS on the same instance. When I try to read a CSV saved to DBFS, I get a file-not-found error. I am assuming a virtual environment is set up during deployment and there is an additional step I need to take to configure the path. Thanks in advance.
12-10-2024 02:39 PM
Hello Adam,
So you are running something similar to:
```python
import streamlit as st
import pandas as pd

# Path to the CSV file in DBFS (via the /dbfs FUSE mount)
file_path = '/dbfs/path/to/your/file.csv'

# Read the CSV file
df = pd.read_csv(file_path)

# Display the dataframe in Streamlit
st.write(df)
```
And it is resulting in this file-not-found issue?
12-11-2024 05:44 AM
Yes, exactly. To add more context: the `read_csv()` line works if I just run it in a notebook with the same path, but it does not work once I try to deploy the application.
01-14-2025 12:48 AM
Hello Walter,
Did you get a chance to look into this?
12-11-2024 06:35 AM
What if you try to list the file with `dbutils.fs.ls("dbfs:/mnt/path/to/data")`? Does it show up?
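If `dbutils` turns out not to be available where the app runs, a plain-filesystem check against the FUSE mount is a rough substitute (the path below is a placeholder):
```python
import os

# Placeholder path; substitute the real location of the CSV.
fuse_path = "/dbfs/path/to/your/file.csv"

# If the DBFS FUSE mount is absent from the app's container,
# even the /dbfs root directory will not exist.
print("/dbfs mounted:", os.path.isdir("/dbfs"))
print("file visible:", os.path.exists(fuse_path))
```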
12-11-2024 07:02 AM
Well, I can't even use `dbutils` in the app. When I try that, I get `NameError: name 'dbutils' is not defined`. Again, this just works in a notebook but not in the app.
If I try to do this:
12-23-2024 01:41 AM
Have you found a solution? As far as I can see, the apps run in an environment where DBFS is not mounted.
12-23-2024 04:22 AM
The environment where the app runs does not have the following directories in the root folder:
01-14-2025 01:41 AM - edited 01-14-2025 01:42 AM
Ensure that the environment where Streamlit is running has access to the DBFS paths. This is typically handled by Databricks, but if you are running Streamlit outside of Databricks, you may need to configure that access yourself; in that case, consider using Databricks Connect to interact with Databricks resources.
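A minimal sketch of the Databricks Connect route, assuming the `databricks-connect` package is installed and configured against your workspace (the path is a placeholder): the read executes remotely on a cluster, where `dbfs:/` paths resolve, rather than in the local environment.
```python
import pandas as pd
from databricks.connect import DatabricksSession

# Spark session backed by a remote Databricks cluster; dbfs:/ paths
# resolve on the cluster side, not in the local process.
spark = DatabricksSession.builder.getOrCreate()

# Placeholder path to the CSV in DBFS.
sdf = spark.read.csv("dbfs:/path/to/your/file.csv", header=True)
df = sdf.toPandas()
```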
01-14-2025 01:48 AM
That's the point: I am running Streamlit from a Databricks App, so I was wondering whether Databricks can recommend the "right" way to access DBFS, or whether the intended way to exchange data is through a SQL warehouse.
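If the SQL warehouse route is the intended pattern, a sketch using the `databricks-sql-connector` package might look like the following; the hostname, HTTP path, token, and table name are all placeholders read from environment variables.
```python
import os
import pandas as pd
from databricks import sql  # databricks-sql-connector

# All connection details are placeholders; in an App they would come
# from app resources or injected environment variables.
with sql.connect(
    server_hostname=os.environ["DATABRICKS_SERVER_HOSTNAME"],
    http_path=os.environ["DATABRICKS_HTTP_PATH"],
    access_token=os.environ["DATABRICKS_TOKEN"],
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT * FROM mycatalog.myschema.mytable LIMIT 100")
        df = pd.DataFrame(
            cur.fetchall(), columns=[c[0] for c in cur.description]
        )
```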
01-17-2025 11:39 AM
I have the identical problem in Databricks Apps. I have tried:
- Reading from a DBFS path, both the FUSE-mount form `/dbfs/myfolder/myfile` and the protocol form `dbfs:/myfolder/myfile`
- Reading from a Unity Catalog volume `/Volumes/mycatalog/mydatabase/myfolder/myfile`
- Making sure that the App principal had rights to read from that specific volume
- Reading from an S3 path `s3://mybucket/mypath/myfile`
None of these methods worked for me, and I cannot use Apps until I have a solution for this.
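One more avenue that may be worth trying: going through the REST API instead of the filesystem, via the Files API in the `databricks-sdk` package. A sketch, assuming the package is installed, the App's service principal has read access on the volume, and the same placeholder volume path as above:
```python
import pandas as pd
from databricks.sdk import WorkspaceClient

# Inside a Databricks App the client should pick up the injected
# credentials automatically; elsewhere it uses your configured auth.
w = WorkspaceClient()

# Download the volume file over HTTP rather than reading it from
# a mounted filesystem; the path is a placeholder.
resp = w.files.download("/Volumes/mycatalog/mydatabase/myfolder/myfile")
df = pd.read_csv(resp.contents)
```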

