12-10-2024 01:47 PM
I am developing a Databricks application using the Streamlit package. I was able to get a "hello world" app deployed successfully, but now I am trying to pass in data that exists in DBFS on the same instance. When I try to read a CSV saved to DBFS, I get a file-not-found error. I assume a virtual environment is being set up during deployment and there is an additional step I need to take to configure the path. Thanks in advance.
12-10-2024 02:39 PM
Hello Adam,
So you are running something similar to:
import streamlit as st
import pandas as pd
# Path to the CSV file in DBFS
file_path = '/dbfs/path/to/your/file.csv'
# Read the CSV file
df = pd.read_csv(file_path)
# Display the dataframe in Streamlit
st.write(df)
And it is resulting in this file not found issue?
12-11-2024 05:44 AM
Yes, exactly. To add more context, the read_csv() line works if I run it in a notebook with the same path, but it stops working once I deploy the application.
01-14-2025 12:48 AM
Hello Walter,
Did you get a chance to look into this?
12-11-2024 06:35 AM
What if you try to list the file using dbutils.fs.ls("dbfs:/mnt/path/to/data")? Does it show up?
12-11-2024 07:02 AM
Well, I can't even use dbutils in the app. When I try that, I get a NameError: name 'dbutils' is not defined. Again, this works in a notebook but not in the app.
If I try to do this:
12-23-2024 01:41 AM
Have you found a solution? As far as I can see, the apps run in an environment where DBFS is not mounted.
12-23-2024 04:22 AM
The environment where the app runs does not have the following directories in the root folder:
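A quick way to confirm this from inside the app is to check for the FUSE mount directly. This is just a diagnostic sketch, not an official check:

```python
import os

def dbfs_mount_visible(root: str = "/dbfs") -> bool:
    """Return True if the DBFS FUSE mount is visible to this process."""
    return os.path.isdir(root)

# Inside a Databricks App environment this is typically False, which
# explains why pd.read_csv('/dbfs/...') raises FileNotFoundError there
# while the same path works in a notebook.
print(dbfs_mount_visible())
```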
01-14-2025 01:41 AM - edited 01-14-2025 01:42 AM
Ensure that the environment where Streamlit is running has access to the DBFS paths. This is typically handled by Databricks. If you are running Streamlit outside of Databricks, you may need to configure access to DBFS yourself; consider using Databricks Connect to interact with Databricks resources.
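If you do go the Databricks Connect route, a minimal sketch could look like the following. It assumes the databricks-connect package is installed and the workspace/cluster are already configured; the CSV path is a placeholder, and the to_dbfs_uri helper is purely illustrative:

```python
def to_dbfs_uri(path: str) -> str:
    """Map a FUSE-style '/dbfs/...' path to the 'dbfs:/...' URI Spark expects."""
    if path.startswith("/dbfs/"):
        return "dbfs:/" + path[len("/dbfs/"):]
    return path

def read_csv_via_connect(path: str):
    """Read a DBFS CSV through Databricks Connect.

    Requires databricks-connect with a configured workspace/cluster;
    not runnable outside that environment.
    """
    from databricks.connect import DatabricksSession
    spark = DatabricksSession.builder.getOrCreate()
    return spark.read.csv(to_dbfs_uri(path), header=True).toPandas()

# Example (placeholder path):
# df = read_csv_via_connect("/dbfs/path/to/your/file.csv")
```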
01-14-2025 01:48 AM
That's the point: I am running Streamlit from a Databricks App, so I was wondering whether Databricks can propose the "right" way to access DBFS, or whether the intended way to exchange data is through a SQL Warehouse.
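For the SQL Warehouse route, a minimal sketch using the databricks-sql-connector package might look like this. The environment variable names and table name are assumptions for illustration, not an official recipe:

```python
import os

def quote_table(name: str) -> str:
    """Backtick-quote each part of a catalog.schema.table name."""
    return ".".join(f"`{part}`" for part in name.split("."))

def fetch_table(name: str):
    """Fetch a table into pandas through a SQL Warehouse.

    Requires the databricks-sql-connector package and these env vars
    (names are an assumption): DATABRICKS_HOST, DATABRICKS_HTTP_PATH,
    DATABRICKS_TOKEN. Not runnable without a reachable warehouse.
    """
    from databricks import sql  # pip install databricks-sql-connector
    with sql.connect(
        server_hostname=os.environ["DATABRICKS_HOST"],
        http_path=os.environ["DATABRICKS_HTTP_PATH"],
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn:
        with conn.cursor() as cursor:
            cursor.execute(f"SELECT * FROM {quote_table(name)}")
            return cursor.fetchall_arrow().to_pandas()
```

This sidesteps the DBFS mount entirely: the app exchanges data through governed tables rather than raw files.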
01-17-2025 11:39 AM
I have the identical problem in Databricks Apps. I have tried...
None of these methods worked for me, and I cannot use Apps until I have a solution for this.
07-04-2025 09:01 AM
I’m also trying to find a solution. I’ve created folders and placed files in them, which I'm accessing from a Streamlit app. The app launches without any issues, but it’s unable to access the files stored in DBFS. I’d really appreciate it if you could share your solution. Thank you!
07-04-2025 04:33 PM
Why not load the files into UC volumes and reference them from the Apps?
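To make that concrete, here is a hedged sketch of reading a CSV from a UC volume inside an App via the workspace Files API in the databricks-sdk package. The catalog/schema/volume names are placeholders, and the App's service principal would still need read access on the volume:

```python
import io

def is_volume_path(path: str) -> bool:
    """Loose check that a path looks like /Volumes/<catalog>/<schema>/<volume>/..."""
    parts = path.strip("/").split("/")
    return path.startswith("/Volumes/") and len(parts) >= 5

def read_volume_csv(path: str):
    """Download a CSV from a UC volume into pandas.

    Requires the databricks-sdk package; inside a Databricks App the
    client picks up the app's injected credentials automatically.
    """
    import pandas as pd
    from databricks.sdk import WorkspaceClient
    w = WorkspaceClient()
    resp = w.files.download(path)
    return pd.read_csv(io.BytesIO(resp.contents.read()))

# Example (placeholder names):
# df = read_volume_csv("/Volumes/my_catalog/my_schema/my_volume/file.csv")
```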
3 weeks ago
I would love to know why apps cannot reference UC volumes as well.