<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: writing to blob storage from databricks (parquet format) in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/writing-to-blob-storage-from-databricks-parquet-format/m-p/83081#M36841</link>
    <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/116049"&gt;@yagmur&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;Did you assign the required permissions to the service principal on the storage account?&lt;BR /&gt;Also make sure you're configuring the connection to the storage account correctly. You should have something similar to the code below:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;# OAuth (service principal) configuration for ADLS Gen2 (abfss)
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "your_client_id",
    # Better: read the secret from a secret scope instead of hard-coding it, e.g.
    # dbutils.secrets.get(scope="your_scope", key="your_secret_key")
    "fs.azure.account.oauth2.client.secret": "your_secret",
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/your_tenant_id/oauth2/token",
}

# Mount the container so it is reachable under /mnt/...
dbutils.fs.mount(
    source="abfss://your_container@your_storage_account.dfs.core.windows.net/",
    mount_point="/mnt/your_mount_point_name",
    extra_configs=configs,
)&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Anyway, nowadays you should use Unity Catalog and configure storage access with a storage credential and an external location.&lt;/P&gt;</description>
    <pubDate>Thu, 15 Aug 2024 11:20:29 GMT</pubDate>
    <dc:creator>szymon_dybczak</dc:creator>
    <dc:date>2024-08-15T11:20:29Z</dc:date>
    <item>
      <title>writing to blob storage from databricks (parquet format)</title>
      <link>https://community.databricks.com/t5/data-engineering/writing-to-blob-storage-from-databricks-parquet-format/m-p/83049#M36826</link>
      <description>&lt;P&gt;Hi,&lt;BR /&gt;I am supposed to create a transformation notebook, but I am having trouble saving the transformed file to blob storage.&lt;BR /&gt;I didn't use any medallion layers, just the one I am performing the transformation in. If I use wasbs I receive one error, and if I use abfss I receive a different one. I tried mounting and saving the files to DBFS, which worked, but I couldn't copy them to blob storage.&lt;BR /&gt;When I use a connection (Azure service principal) I get the first error I mentioned (the error screenshots are attached).&lt;BR /&gt;Thanks in advance.&lt;BR /&gt;&lt;BR /&gt;PS: If I try to save to the raw container, which I mounted and where the files are read from, the save succeeds. I don't understand why I get that error message even though I also connected without a mount at the same time.&lt;/P&gt;</description>
      <pubDate>Thu, 15 Aug 2024 07:23:58 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/writing-to-blob-storage-from-databricks-parquet-format/m-p/83049#M36826</guid>
      <dc:creator>yagmur</dc:creator>
      <dc:date>2024-08-15T07:23:58Z</dc:date>
    </item>
    <item>
      <title>Re: writing to blob storage from databricks (parquet format)</title>
      <link>https://community.databricks.com/t5/data-engineering/writing-to-blob-storage-from-databricks-parquet-format/m-p/83081#M36841</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/116049"&gt;@yagmur&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;Did you assign the required permissions to the service principal on the storage account?&lt;BR /&gt;Also make sure you're configuring the connection to the storage account correctly. You should have something similar to the code below:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;# OAuth (service principal) configuration for ADLS Gen2 (abfss)
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "your_client_id",
    # Better: read the secret from a secret scope instead of hard-coding it, e.g.
    # dbutils.secrets.get(scope="your_scope", key="your_secret_key")
    "fs.azure.account.oauth2.client.secret": "your_secret",
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/your_tenant_id/oauth2/token",
}

# Mount the container so it is reachable under /mnt/...
dbutils.fs.mount(
    source="abfss://your_container@your_storage_account.dfs.core.windows.net/",
    mount_point="/mnt/your_mount_point_name",
    extra_configs=configs,
)&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Anyway, nowadays you should use Unity Catalog and configure storage access with a storage credential and an external location.&lt;/P&gt;</description>
      <pubDate>Thu, 15 Aug 2024 11:20:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/writing-to-blob-storage-from-databricks-parquet-format/m-p/83081#M36841</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2024-08-15T11:20:29Z</dc:date>
    </item>
    <item>
      <title>Re: writing to blob storage from databricks (parquet format)</title>
      <link>https://community.databricks.com/t5/data-engineering/writing-to-blob-storage-from-databricks-parquet-format/m-p/84299#M37178</link>
      <description>&lt;P&gt;Thanks for the effort, but that is the same as what I did. It's still not working.&lt;/P&gt;</description>
      <pubDate>Tue, 27 Aug 2024 10:00:14 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/writing-to-blob-storage-from-databricks-parquet-format/m-p/84299#M37178</guid>
      <dc:creator>yagmur</dc:creator>
      <dc:date>2024-08-27T10:00:14Z</dc:date>
    </item>
  </channel>
</rss>