<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Connect to azure data lake storage using databricks free edition in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128669#M48311</link>
    <description>&lt;P&gt;Thank you, I was able to do everything using the&amp;nbsp;&lt;SPAN&gt;Azure Data Lake Storage and Azure Identity client library packages on the Free Edition!&lt;/SPAN&gt;&lt;/P&gt;</description>
    <pubDate>Sun, 17 Aug 2025 14:34:25 GMT</pubDate>
    <dc:creator>TalessRocha</dc:creator>
    <dc:date>2025-08-17T14:34:25Z</dc:date>
    <item>
      <title>Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/127859#M48110</link>
      <description>&lt;P&gt;Hello guys, I'm using Databricks Free Edition (serverless) and I am trying to connect to Azure Data Lake Storage.&lt;/P&gt;&lt;P&gt;&lt;SPAN class=""&gt;The problem I'm having is that in the Free Edition we can't configure the cluster, so I tried to make the connection via notebook using spark.conf.set, but this configuration is not enabled in the Free Edition... and in the Unity Catalog interface only the options to add AWS and Cloudflare credentials appear. Is there any other way?&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN class=""&gt;I have tried using&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;dbutils.fs.&lt;/SPAN&gt;&lt;SPAN&gt;mount as well, but it is not available on the Free Edition...&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 08 Aug 2025 23:42:58 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/127859#M48110</guid>
      <dc:creator>TalessRocha</dc:creator>
      <dc:date>2025-08-08T23:42:58Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/127884#M48114</link>
      <description>&lt;P&gt;hello&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/178634"&gt;@TalessRocha&lt;/a&gt;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;In the Free Edition, DBFS is disabled. You should use Unity Catalog for that purpose anyway; DBFS is a deprecated pattern of interacting with storage.&lt;/P&gt;&lt;P&gt;So, to use a volume, perform the following steps:&lt;/P&gt;&lt;P&gt;Go to Catalogs (1) -&amp;gt; Click the workspace catalog (2) -&amp;gt; Click the default schema -&amp;gt; Click the Create button (3)&lt;/P&gt;&lt;P&gt;Under the Create button (3) you will have an option to create a volume. Pick a name and then create the volume.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Khaja_Zaffer_0-1754729527108.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/18877iAA0E1BEA21014E22/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Khaja_Zaffer_0-1754729527108.png" alt="Khaja_Zaffer_0-1754729527108.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;reference:&amp;nbsp;&lt;SPAN&gt;szymon_dybczak&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Khaja_Zaffer_1-1754729527145.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/18876i732C590BD679F7E3/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Khaja_Zaffer_1-1754729527145.png" alt="Khaja_Zaffer_1-1754729527145.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;If you did that, your new volume should appear in Unity Catalog under the default schema. Now you will have an option to upload a file to the volume:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Khaja_Zaffer_2-1754729527157.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/18875iE8797774B747A2F6/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Khaja_Zaffer_2-1754729527157.png" alt="Khaja_Zaffer_2-1754729527157.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;And here's an example of how to read a CSV from a volume into a DataFrame:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Khaja_Zaffer_3-1754729527126.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/18878i69AA2EB5B59CFEAA/image-size/medium?v=v2&amp;amp;px=400" role="button" title="Khaja_Zaffer_3-1754729527126.png" alt="Khaja_Zaffer_3-1754729527126.png" /&gt;&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Sat, 09 Aug 2025 08:54:10 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/127884#M48114</guid>
      <dc:creator>Khaja_Zaffer</dc:creator>
      <dc:date>2025-08-09T08:54:10Z</dc:date>
    </item>
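The volume workflow in the post above can be sketched in a few lines. This is a minimal, hedged example: the catalog, schema, volume, and file names are placeholders for whatever was created in the UI, and `spark` is only predefined inside a Databricks notebook.

```python
# Minimal sketch of reading a CSV from a Unity Catalog volume created as
# described above. Catalog/schema/volume/file names are placeholders.

def volume_path(catalog: str, schema: str, volume: str, filename: str) -> str:
    """UC volumes are exposed to notebooks under the /Volumes mount point."""
    return f"/Volumes/{catalog}/{schema}/{volume}/{filename}"

path = volume_path("workspace", "default", "my_volume", "data.csv")

# Inside a Free Edition notebook (where `spark` is predefined):
# df = spark.read.option("header", True).csv(path)
# display(df)
```

The same `/Volumes/...` path also works with `dbutils.fs.ls` and plain Python file APIs in the notebook.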
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/127900#M48116</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/178634"&gt;@TalessRocha&lt;/a&gt;,&amp;nbsp;if you were trying to connect to ADLS on your local machine using Python (for instance), you'd probably install the appropriate Python packages to authenticate and then retrieve the containers &amp;amp; blobs/files. I don't see why we can't employ the same logic with the Free Edition.&lt;BR /&gt;&lt;BR /&gt;Here's the official documentation on doing that:&amp;nbsp;&lt;A href="https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-python?tabs=azure-ad" target="_blank" rel="noopener"&gt;https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-python?tabs=azure-ad&lt;/A&gt;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;Pair that with a YouTube video/tutorial and some AI assistance.&lt;BR /&gt;&lt;BR /&gt;I'd try it locally using Python &amp;amp; then try it on the Free Edition.&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;A word of warning: don't connect to confidential information and bring it into the Databricks Free Edition. You should read the terms of service as to why; it goes without saying that it's not your storage/compute under the hood.&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;The only potential blocker here, in my opinion, is being unable to pip install the appropriate libraries on the Free Edition. Maybe that's worth trying before getting stuck in the weeds.&lt;BR /&gt;&lt;BR /&gt;Let me know how you end up resolving this, I'm interested.&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;All the best,&lt;BR /&gt;BS&lt;/P&gt;</description>
      <pubDate>Sat, 09 Aug 2025 12:24:33 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/127900#M48116</guid>
      <dc:creator>BS_THE_ANALYST</dc:creator>
      <dc:date>2025-08-09T12:24:33Z</dc:date>
    </item>
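The package-based approach suggested above (and the one the original poster later confirmed worked) can be sketched as follows. This assumes `azure-identity` and `azure-storage-file-datalake` are installed (e.g. via `%pip install` in the notebook); the tenant/client IDs and secret are placeholders, and the SDK imports are kept inside the function so the path helper stays usable without the packages.

```python
# Hypothetical sketch: listing files in an ADLS Gen2 container from any
# Python environment (local or a Free Edition notebook). All credential
# values below are placeholders, not real identifiers.

def account_url(storage_account: str) -> str:
    """Build the DFS endpoint URL for an ADLS Gen2 storage account."""
    return f"https://{storage_account}.dfs.core.windows.net"

def list_paths(storage_account: str, container: str) -> list:
    # Imported locally so account_url() works even without the SDK installed.
    from azure.identity import ClientSecretCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    credential = ClientSecretCredential(
        tenant_id="<tenant-id>",         # placeholder
        client_id="<client-id>",         # placeholder
        client_secret="<client-secret>", # placeholder
    )
    service = DataLakeServiceClient(account_url(storage_account),
                                    credential=credential)
    fs = service.get_file_system_client(container)
    return [p.name for p in fs.get_paths()]
```

From there, `fs.get_file_client(name).download_file().readall()` returns bytes that can be loaded into pandas or a Spark DataFrame.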
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/127985#M48129</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/173840"&gt;@Khaja_Zaffer&lt;/a&gt;&amp;nbsp;I don't think that was what&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/178634"&gt;@TalessRocha&lt;/a&gt;&amp;nbsp;was looking for. I think it's more around connecting to blob storage in Azure in the free edition.&lt;BR /&gt;&lt;BR /&gt;That's a great answer for how to import data into the free edition though!&lt;BR /&gt;&lt;BR /&gt;All the best,&lt;BR /&gt;BS&lt;/P&gt;</description>
      <pubDate>Mon, 11 Aug 2025 08:23:49 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/127985#M48129</guid>
      <dc:creator>BS_THE_ANALYST</dc:creator>
      <dc:date>2025-08-11T08:23:49Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/127997#M48133</link>
      <description>&lt;P&gt;I would always use the UC way. That is the standard for production workloads.&lt;/P&gt;</description>
      <pubDate>Mon, 11 Aug 2025 08:57:48 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/127997#M48133</guid>
      <dc:creator>sumitPanda</dc:creator>
      <dc:date>2025-08-11T08:57:48Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128000#M48135</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/121422"&gt;@sumitPanda&lt;/a&gt;,&amp;nbsp;is this supported in Free Edition Databricks? i.e. connecting to ADLS via UC. I think that's part of the constraints at play here.&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;Completely agree, in production, UC is the way. In fact, we'd normally not need to connect to ADLS if we're using Azure Databricks (and not using external tables). It does depend on the use case, of course.&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;All the best,&lt;BR /&gt;BS&lt;/P&gt;</description>
      <pubDate>Mon, 11 Aug 2025 09:04:20 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128000#M48135</guid>
      <dc:creator>BS_THE_ANALYST</dc:creator>
      <dc:date>2025-08-11T09:04:20Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128004#M48136</link>
      <description>&lt;P&gt;Yes, that's the limitation. I checked through the documentation and it's not explicitly mentioned anywhere.&lt;/P&gt;&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/connect/storage/azure-storage" target="_blank"&gt;Here are the details on how to connect: Connect to Azure Data Lake Storage and Blob Storage - Azure Databricks | Microsoft Learn&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 11 Aug 2025 09:12:05 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128004#M48136</guid>
      <dc:creator>sumitPanda</dc:creator>
      <dc:date>2025-08-11T09:12:05Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128045#M48145</link>
      <description>&lt;P&gt;The DBFS-related information above I took from&amp;nbsp;&lt;SPAN&gt;szymon_dybczak&lt;/SPAN&gt; (thank you).&lt;/P&gt;&lt;P&gt;Sure, next time I will read things carefully before answering,&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/146924"&gt;@BS_THE_ANALYST&lt;/a&gt;.&lt;/P&gt;</description>
      <pubDate>Mon, 11 Aug 2025 11:48:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128045#M48145</guid>
      <dc:creator>Khaja_Zaffer</dc:creator>
      <dc:date>2025-08-11T11:48:03Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128669#M48311</link>
      <description>&lt;P&gt;Thank you, I was able to do everything using the&amp;nbsp;&lt;SPAN&gt;Azure Data Lake Storage and Azure Identity client library packages on the Free Edition!&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Sun, 17 Aug 2025 14:34:25 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128669#M48311</guid>
      <dc:creator>TalessRocha</dc:creator>
      <dc:date>2025-08-17T14:34:25Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128673#M48312</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/178634"&gt;@TalessRocha&lt;/a&gt;&amp;nbsp;thanks for getting back to us!&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;Glad to hear you got it working, that's awesome.&amp;nbsp;Best of luck with your projects.&lt;BR /&gt;&lt;BR /&gt;All the best,&lt;BR /&gt;BS&lt;/P&gt;</description>
      <pubDate>Sun, 17 Aug 2025 14:38:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/128673#M48312</guid>
      <dc:creator>BS_THE_ANALYST</dc:creator>
      <dc:date>2025-08-17T14:38:54Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/135344#M50325</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/146924"&gt;@BS_THE_ANALYST&lt;/a&gt;&amp;nbsp;I also have a question here&amp;nbsp;&lt;A href="https://community.databricks.com/t5/data-engineering/cannot-import-pyspark-pipelines-module/td-p/135293" target="_blank"&gt;Cannot import pyspark.pipelines module - Databricks Community - 135293&lt;/A&gt;&amp;nbsp;in case you have any solution/suggestions.&lt;/P&gt;</description>
      <pubDate>Sat, 18 Oct 2025 18:39:47 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/135344#M50325</guid>
      <dc:creator>Saf4Databricks</dc:creator>
      <dc:date>2025-10-18T18:39:47Z</dc:date>
    </item>
    <item>
      <title>Re: Connect to azure data lake storage using databricks free edition</title>
      <link>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/156853#M54485</link>
      <description>&lt;P&gt;If you want to read from your Azure storage account using Databricks Free Edition, you can add a specific option when reading:&lt;/P&gt;&lt;PRE&gt;spark.read.option("fs.azure.account.key.&amp;lt;storage-account-name&amp;gt;.dfs.core.windows.net",
                  "your_storage_account_key")&lt;/PRE&gt;&lt;P&gt;I just tried and it worked &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt;&lt;/P&gt;&lt;P&gt;However, I could not achieve the same with dbutils.fs.ls("abfss://...."), which works well in Azure Databricks, but I cannot find where I can pass the same option.&lt;/P&gt;</description>
      <pubDate>Wed, 13 May 2026 17:54:14 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/connect-to-azure-data-lake-storage-using-databricks-free-edition/m-p/156853#M54485</guid>
      <dc:creator>pjvi</dc:creator>
      <dc:date>2026-05-13T17:54:14Z</dc:date>
    </item>
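The account-key read described in the post above hinges on two strings: the per-account option key and the `abfss://` path. A small sketch, with the storage account, container, and key as placeholders (the Spark call itself only runs inside a notebook):

```python
# Helpers for the account-key approach described above. The option key
# embeds the storage account name; the path uses the abfss:// scheme.

def account_key_option(storage_account: str) -> str:
    """Spark/Hadoop ABFS config key carrying the storage account key."""
    return f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"

def abfss_path(container: str, storage_account: str, relative_path: str) -> str:
    """Fully qualified ABFS path to a file inside a container."""
    return f"abfss://{container}@{storage_account}.dfs.core.windows.net/{relative_path}"

# Inside a Free Edition notebook (names and key are placeholders):
# df = (spark.read
#       .option(account_key_option("mystorageacct"), "your_storage_account_key")
#       .csv(abfss_path("mycontainer", "mystorageacct", "data.csv")))
```

Note that account keys grant full access to the account, so a secret scope (or, in production, Unity Catalog credentials) is preferable to pasting the key into a notebook.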
  </channel>
</rss>

