List all files in a Blob Container
06-27-2019 11:09 AM
I am trying to find a way to list all files, and their sizes, in all folders and all sub-folders. (I guess these are called blobs in the Databricks world.) I can easily list all files, and their sizes, in one single folder, but I can't come up with Python code that lists ALL files and the sizes of each of them. Just for reference, on a desktop machine the code would look like this:
import sys, os
root = "C:\\path_here\\"
path = os.path.join(root, "targetdirectory")
for path, subdirs, files in os.walk(root):
    for name in files:
        print(os.path.join(path, name))
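The loop above prints names only, while the question also asks for sizes; on a desktop that just takes `os.path.getsize`. A minimal sketch (the temporary directory exists only so the snippet runs anywhere; point `root` at your own folder):

```python
import os
import tempfile

# Throwaway directory tree so the example is self-contained; in practice,
# set `root` to the folder you want to inventory.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
for rel, data in [("a.txt", b"hello"), (os.path.join("sub", "b.txt"), b"hi")]:
    with open(os.path.join(root, rel), "wb") as f:
        f.write(data)

# Walk every folder and sub-folder, recording each file path and its size in bytes.
inventory = []
for path, subdirs, files in os.walk(root):
    for name in files:
        full = os.path.join(path, name)
        inventory.append((full, os.path.getsize(full)))

for full, size in sorted(inventory):
    print(full, size)
```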
Maybe there is a non-Python way to do this. I'd like to get an inventory of all the files in Blob Storage, any way I can.
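On the Databricks side, `dbutils.fs.ls()` lists a single folder and returns `FileInfo` entries carrying `.path` and `.size`, but it does not recurse. One possible approach is a small recursive helper; the listing function is passed in as a parameter so the sketch can run outside a cluster (on a cluster you would pass `dbutils.fs.ls`). This relies on directory entries having paths that end with `/`, which is worth verifying on your runtime:

```python
def deep_ls(ls, path):
    """Yield (path, size) for every file under `path`, recursing into sub-folders.

    `ls` is any function that, given a path, returns entries with `.path`
    and `.size` attributes (e.g. dbutils.fs.ls on a Databricks cluster).
    """
    for entry in ls(path):
        if entry.path.endswith("/"):          # assumed folder marker
            yield from deep_ls(ls, entry.path)
        else:
            yield (entry.path, entry.size)

# On a cluster (hypothetical mount point):
# for p, size in deep_ls(dbutils.fs.ls, "dbfs:/mnt/rawdata/"):
#     print(p, size)
```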
- Labels: Blob-storage, Folders, List, Python3
10-14-2019 01:38 PM
from azure.storage.blob import BlockBlobService

block_blob_service = BlockBlobService(account_name='your_acct_name', account_key='your_acct_key')

mylist = []
generator = block_blob_service.list_blobs('rawdata')
for blob in generator:
    mylist.append(blob.name)
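The snippet above collects only blob names, while the question also asks for sizes. With the same legacy SDK, each listed `Blob` should expose its size at `blob.properties.content_length` (an assumption worth checking against your SDK version); a small helper that records both, written so it accepts any iterable of listed blobs:

```python
def blob_inventory(blobs):
    """Return [(name, size_in_bytes)] for an iterable of listed blobs.

    Assumes each blob has `.name` and `.properties.content_length`,
    as the legacy azure-storage-blob Blob objects do.
    """
    return [(b.name, b.properties.content_length) for b in blobs]

# Against a real container (requires credentials; 'rawdata' as above):
# generator = block_blob_service.list_blobs('rawdata')
# for name, size in blob_inventory(generator):
#     print(name, size)
```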

