I am building a Streamlit-based app on Databricks that allows users to do the following (a rough sketch of the flow is included after this list):
- Upload Excel scenario files
- Store them in DBFS (e.g., /FileStore/SCO/scenarios/)
- Trigger a simulation/optimization model that uses the uploaded/stored file as its input
- Store the outputs (logs, visualizations, result summaries) in DBFS or Unity Catalog
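
Roughly, the upload part of the app looks like this (widget label and file-name handling are illustrative; the DBFS target path is the one mentioned above):

```python
import streamlit as st

uploaded = st.file_uploader("Upload Excel scenario file", type=["xlsx"])

if uploaded is not None:
    scenario_bytes = uploaded.getvalue()  # raw bytes of the Excel file
    target_path = f"/FileStore/SCO/scenarios/{uploaded.name}"

    # This is the step I cannot get working from inside the App:
    # persist scenario_bytes at target_path so the simulation/optimization
    # job can read it as input, and later write its outputs back.
    st.info(f"Scenario would be stored at {target_path}")
```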
Issue: When deploying this as a custom Databricks App, I run into the following restrictions (see the snippet after this list for what I was hoping to do):
- I cannot write to /dbfs/... directly from inside the app
- dbutils and the Databricks CLI are not available
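
This is essentially the kind of direct write that does not work from inside the App (the file name is just an example):

```python
# Direct filesystem write against the /dbfs FUSE path -- not possible
# from inside the App environment:
scenario_bytes = b"..."  # bytes of the uploaded Excel file

with open("/dbfs/FileStore/SCO/scenarios/scenario.xlsx", "wb") as f:
    f.write(scenario_bytes)

# dbutils.fs.* and the Databricks CLI are not available in the App either.
```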
I want to know:
- What is the correct, supported way to write files to DBFS from a Databricks App (specifically a Streamlit app)?
- Is there a built-in API method or a recommended workaround (for example, something like the SDK-based sketch below)?
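
For example, is something along these lines the intended pattern: calling the Databricks SDK for Python (databricks-sdk) from inside the App, with the DBFS API for /FileStore paths or the Files API for a Unity Catalog volume? This is only my guess at the pattern; the file name and the volume path are made up, and I have not confirmed that the App's credentials are picked up this way.

```python
import io
from databricks.sdk import WorkspaceClient

# Assumption: inside a Databricks App, WorkspaceClient() can authenticate
# from the environment the platform provides to the app.
w = WorkspaceClient()

scenario_bytes = b"..."  # bytes of the uploaded Excel file

# Option A: DBFS API (target folder taken from my setup above; file name illustrative)
w.dbfs.upload(
    "/FileStore/SCO/scenarios/scenario.xlsx",
    io.BytesIO(scenario_bytes),
    overwrite=True,
)

# Option B: Files API against a Unity Catalog volume (volume path is hypothetical)
w.files.upload(
    "/Volumes/main/sco/scenarios/scenario.xlsx",
    io.BytesIO(scenario_bytes),
    overwrite=True,
)
```

If neither of these is the supported route from an App, I would appreciate a pointer to the recommended pattern for persisting uploads and model outputs.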