<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic List all files in a Blob Container in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/list-all-files-in-a-blob-container/m-p/27968#M19806</link>
<description>&lt;P&gt;Forum thread: how to recursively list all files, with their sizes, across every folder and subfolder of an Azure Blob Storage container from Databricks.&lt;/P&gt;</description>
    <pubDate>Thu, 27 Jun 2019 18:09:19 GMT</pubDate>
    <dc:creator>asher</dc:creator>
    <dc:date>2019-06-27T18:09:19Z</dc:date>
    <item>
      <title>List all files in a Blob Container</title>
      <link>https://community.databricks.com/t5/data-engineering/list-all-files-in-a-blob-container/m-p/27968#M19806</link>
      <description>&lt;P&gt;I am trying to find a way to list all files, and their sizes, in all folders and all subfolders. I guess these are called blobs in the Databricks world. Anyway, I can easily list all files, and their sizes, in one single folder, but I can't come up with Python code that lists ALL files and the size of each of them recursively. Just for reference, on a desktop machine the code would look like this:&lt;/P&gt;
&lt;PRE&gt;import os

root = "C:\\path_here\\"
for path, subdirs, files in os.walk(root):
    for name in files:
        full_path = os.path.join(path, name)
        print(full_path, os.path.getsize(full_path))&lt;/PRE&gt;
&lt;P&gt;Maybe there is a non-Python way to do this. I'd like to get an inventory of all the files in Blob Storage, any way I can.&lt;/P&gt;
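A Databricks-native sketch of the recursive walk the question asks for, assuming a notebook environment where `dbutils.fs.ls` is available (its entries expose `.path`, `.name`, and `.size`, and directory names end with `/`); the lister is passed in as a parameter so the walk logic itself is self-contained:

```python
# Recursive file walk in the style of os.walk, with the directory lister
# injected as a parameter. On Databricks the lister would be dbutils.fs.ls
# (assumption: notebook environment); entries whose name ends with '/' are
# treated as directories, everything else as a file.

def walk_files(path, list_fn):
    """Return a list of (path, size) tuples for every file under `path`."""
    results = []
    for entry in list_fn(path):
        if entry.name.endswith('/'):       # directory: recurse into it
            results.extend(walk_files(entry.path, list_fn))
        else:                              # file: record its path and size
            results.append((entry.path, entry.size))
    return results
```

In a notebook this might be invoked as `walk_files("dbfs:/mnt/rawdata/", dbutils.fs.ls)`; because `walk_files` has no Databricks dependency of its own, the same function can be exercised locally with any lister.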
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 27 Jun 2019 18:09:19 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/list-all-files-in-a-blob-container/m-p/27968#M19806</guid>
      <dc:creator>asher</dc:creator>
      <dc:date>2019-06-27T18:09:19Z</dc:date>
    </item>
    <item>
      <title>Re: List all files in a Blob Container</title>
      <link>https://community.databricks.com/t5/data-engineering/list-all-files-in-a-blob-container/m-p/27969#M19807</link>
      <description>&lt;P&gt;Using the legacy azure-storage Python SDK:&lt;/P&gt;&lt;PRE&gt;from azure.storage.blob import BlockBlobService

block_blob_service = BlockBlobService(account_name='your_acct_name',
                                      account_key='your_acct_key')

# list_blobs is flat: it returns every blob in the container,
# including blobs under "folder" prefixes
mylist = []
for blob in block_blob_service.list_blobs('rawdata'):
    mylist.append((blob.name, blob.properties.content_length))&lt;/PRE&gt;</description>
      <pubDate>Mon, 14 Oct 2019 20:38:26 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/list-all-files-in-a-blob-container/m-p/27969#M19807</guid>
      <dc:creator>asher</dc:creator>
      <dc:date>2019-10-14T20:38:26Z</dc:date>
    </item>
  </channel>
</rss>

