How do you connect a folder path from your desktop to DB notebook?
07-28-2022 08:13 AM
I have a folder with multiple Excel files that contain information from different cost centers, and these files get updated every week. I'm trying to upload all these files to a DB notebook. Is there a way to connect the folder path directly to DBFS so I can avoid uploading the files one by one?
- Labels:
  - DB Notebook
  - Folder Path
  - Python
07-28-2022 07:43 PM
Hello, thanks for your question.
You can mount cloud object storage to DBFS and use it in a notebook. Please refer here.
It is not possible to mount a local folder from your desktop to DBFS, but you should be able to use the Databricks CLI to copy the entire folder to DBFS.
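Once the folder has been copied to (or mounted on) DBFS, all of the weekly files can be read and combined in one pass rather than file by file. A minimal sketch, using a temporary local folder and CSV stand-ins so it runs anywhere; the folder path and file names are illustrative, and in a real workspace you would point `folder` at your DBFS location and use `pd.read_excel` for the `.xlsx` files:

```python
import glob
import os
import tempfile

import pandas as pd

# Stand-in for the mounted/copied DBFS folder (hypothetical path in practice)
folder = tempfile.mkdtemp()

# Simulate two weekly cost-center files (CSV here; real files would be Excel)
pd.DataFrame({"cost_center": ["A"], "amount": [100]}).to_csv(
    os.path.join(folder, "cc_a.csv"), index=False
)
pd.DataFrame({"cost_center": ["B"], "amount": [200]}).to_csv(
    os.path.join(folder, "cc_b.csv"), index=False
)

# Read every file in the folder in one pass and combine into a single frame
frames = [pd.read_csv(p) for p in sorted(glob.glob(os.path.join(folder, "*.csv")))]
combined = pd.concat(frames, ignore_index=True)
print(combined)
```

Because the glob picks up whatever files are in the folder, the same notebook cell keeps working as new weekly files arrive.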
07-29-2022 04:04 AM
Exactly. Additionally, if you use Azure, you can install Azure Storage Explorer on your Windows machine so you can drag and drop files.
If you keep your Excel files in OneDrive / SharePoint, you can automate copying them to Azure Data Lake Storage using tools like Power Automate, Logic Apps, and Azure Data Factory.
08-17-2022 02:34 PM
Hi @ricard castañeda,
Just a friendly follow-up: did any of the responses help you resolve your question? If so, please mark the best answer. Otherwise, please let us know if you still need help.

