<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Azure Databricks - Exporting data frame to external volume in Administration &amp; Architecture</title>
    <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-exporting-data-frame-to-external-volume/m-p/147294#M4814</link>
    <description>&lt;P&gt;For UC external volumes on ADLS Gen2, Databricks often needs to generate a user delegation SAS to securely access the storage path from compute. Generating that SAS requires first calling the Get User Delegation Key operation.&lt;BR /&gt;If the Access Connector / managed identity / service principal has only container-level “Storage Blob Data Reader/Contributor” permissions, the call can still fail, because Get User Delegation Key typically requires permissions at the storage account scope (not only the container scope).&lt;/P&gt;&lt;P&gt;1) Identify the identity used by the external location.&lt;/P&gt;&lt;P&gt;In Unity Catalog, your external location points to a storage credential. That storage credential is backed by an Access Connector for Azure Databricks (a managed identity) or a service principal. You need to grant the Azure Storage permissions to that identity.&lt;/P&gt;&lt;P&gt;2) Grant both roles:&lt;/P&gt;&lt;P&gt;Storage Blob Data Contributor (or Owner): read/write/list on blobs&lt;BR /&gt;Storage Blob Delegator: allows generating user delegation keys (this is the one that is usually missing)&lt;/P&gt;&lt;P&gt;Scope matters: assign at the storage account level (or higher). Container-level role assignments frequently aren’t enough for delegation keys.&lt;BR /&gt;Recommended RBAC:&lt;BR /&gt;At storage account scope: Storage Blob Delegator&lt;BR /&gt;At container (filesystem) scope: Storage Blob Data Contributor (or narrower if your org requires)&lt;/P&gt;</description>
    <pubDate>Fri, 06 Feb 2026 21:32:32 GMT</pubDate>
    <dc:creator>nayan_wylde</dc:creator>
    <dc:date>2026-02-06T21:32:32Z</dc:date>
    <item>
      <title>Azure Databricks - Exporting data frame to external volume</title>
      <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-exporting-data-frame-to-external-volume/m-p/146922#M4813</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I am reading a Delta table and exporting it to an external volume. The Unity Catalog external volume points to an Azure Data Lake Storage container.&lt;/P&gt;&lt;P&gt;When I run the code below, I encounter the error message shown below. (When I export the data to a managed volume, the operation completes successfully.)&lt;BR /&gt;&lt;BR /&gt;Could you please help?&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Converting to Pandas...
Creating Excel in memory...
Writing to: /Volumes/dev_catalog/silver_schema/external_volume1/outputfolder/competitor_data.xlsx
❌ Error writing to volume: An error occurred while calling o499.cp.
: com.databricks.sql.managedcatalog.acl.UnauthorizedAccessException: PERMISSION_DENIED: Request for user delegation key is not authorized. Details: None
	at com.databricks.sql.managedcatalog.client.ErrorDetailsHandlerImpl.wrapServiceException(ErrorDetailsHandler.scala:119)
	at com.databricks.sql.managedcatalog.client.ErrorDetailsHandlerImpl.wrapServiceException$(ErrorDetailsHandler.scala:88)&lt;/LI-CODE&gt;&lt;P&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="python"&gt;!pip install openpyxl

%restart_python

df = spark.read.table('dev_catalog.silver_schema.silver_table')

# For Excel files:
def save_as_excel_to_external_volume(df, volume_path, filename="data.xlsx", sheet_name="Sheet1"):
    """Save DataFrame as Excel using dbutils.fs"""
    import pandas as pd
    from io import BytesIO
    
    volume_path = volume_path.rstrip('/')
    full_path = f"{volume_path}/{filename}"
    
    print("Converting to Pandas...")
    pandas_df = df.toPandas()
    
    print("Creating Excel in memory...")
    excel_buffer = BytesIO()
    pandas_df.to_excel(excel_buffer, index=False, sheet_name=sheet_name, engine='openpyxl')
    excel_bytes = excel_buffer.getvalue()
    
    print(f"Writing to: {full_path}")
    try:
        # For binary files, write to temp then copy
        temp_path = f"/tmp/{filename}"
        with open(temp_path, 'wb') as f:
            f.write(excel_bytes)
        
        # Copy from temp to volume using dbutils
        dbutils.fs.cp(f"file:{temp_path}", full_path)
        
        # Clean up temp
        dbutils.fs.rm(f"file:{temp_path}")
        
        print(f"✓ Successfully saved to {full_path}")
        return full_path
    except Exception as e:
        print(f"❌ Error writing to volume: {e}")
        raise
        

volume_path = "/Volumes/dev_catalog/silver_schema/external_volume1/outputfolder/"

save_as_excel_to_external_volume(df, volume_path, "competitor_data.xlsx", "CompetitorData")&lt;/LI-CODE&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
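The write path in the function above follows a "write bytes to a local temp file, then copy to the destination" pattern. A minimal standalone sketch of that pattern follows; `shutil.copy` stands in for the Databricks-only `dbutils.fs.cp`, and a local directory stands in for the `/Volumes/...` path (both are assumptions for illustration only — on Databricks you would keep `dbutils.fs.cp` and the real volume path):

```python
import os
import shutil
import tempfile
from io import BytesIO


def save_bytes_via_temp(payload: bytes, dest_dir: str, filename: str) -> str:
    """Write an in-memory binary payload to dest_dir/filename via a temp file."""
    dest_dir = dest_dir.rstrip("/")
    full_path = os.path.join(dest_dir, filename)

    # 1) Write the buffer to a local temp file first (binary mode).
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(payload)
        temp_path = tmp.name

    try:
        # 2) Copy temp file to the destination.
        # On Databricks this would be: dbutils.fs.cp(f"file:{temp_path}", full_path)
        shutil.copy(temp_path, full_path)
    finally:
        # 3) Clean up the temp file whether or not the copy succeeded.
        os.remove(temp_path)
    return full_path


# Example: round-trip a small in-memory buffer.
buf = BytesIO(b"spreadsheet bytes")
out_dir = tempfile.mkdtemp()
written = save_bytes_via_temp(buf.getvalue(), out_dir + "/", "data.xlsx")
```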
      <pubDate>Thu, 05 Feb 2026 16:58:48 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/azure-databricks-exporting-data-frame-to-external-volume/m-p/146922#M4813</guid>
      <dc:creator>lion_king_84</dc:creator>
      <dc:date>2026-02-05T16:58:48Z</dc:date>
    </item>
    <item>
      <title>Re: Azure Databricks - Exporting data frame to external volume</title>
      <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-exporting-data-frame-to-external-volume/m-p/147294#M4814</link>
      <description>&lt;P&gt;For UC external volumes on ADLS Gen2, Databricks often needs to generate a user delegation SAS to securely access the storage path from compute. Generating that SAS requires first calling the Get User Delegation Key operation.&lt;BR /&gt;If the Access Connector / managed identity / service principal has only container-level “Storage Blob Data Reader/Contributor” permissions, the call can still fail, because Get User Delegation Key typically requires permissions at the storage account scope (not only the container scope).&lt;/P&gt;&lt;P&gt;1) Identify the identity used by the external location.&lt;/P&gt;&lt;P&gt;In Unity Catalog, your external location points to a storage credential. That storage credential is backed by an Access Connector for Azure Databricks (a managed identity) or a service principal. You need to grant the Azure Storage permissions to that identity.&lt;/P&gt;&lt;P&gt;2) Grant both roles:&lt;/P&gt;&lt;P&gt;Storage Blob Data Contributor (or Owner): read/write/list on blobs&lt;BR /&gt;Storage Blob Delegator: allows generating user delegation keys (this is the one that is usually missing)&lt;/P&gt;&lt;P&gt;Scope matters: assign at the storage account level (or higher). Container-level role assignments frequently aren’t enough for delegation keys.&lt;BR /&gt;Recommended RBAC:&lt;BR /&gt;At storage account scope: Storage Blob Delegator&lt;BR /&gt;At container (filesystem) scope: Storage Blob Data Contributor (or narrower if your org requires)&lt;/P&gt;</description>
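The two role assignments described in the reply above can be granted with the Azure CLI. A sketch only, assuming the Access Connector's managed-identity principal ID and the resource names shown in the shell variables (all placeholder values you would substitute):

```shell
# Placeholder identifiers -- substitute your own values.
PRINCIPAL_ID="00000000-0000-0000-0000-000000000000"  # Access Connector managed identity (or SP object ID)
SUBSCRIPTION="my-subscription-id"
RG="my-resource-group"
ACCOUNT="mystorageaccount"
CONTAINER="mycontainer"

# Storage Blob Delegator at STORAGE ACCOUNT scope (required for user delegation keys)
az role assignment create \
  --assignee-object-id "$PRINCIPAL_ID" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage Blob Delegator" \
  --scope "/subscriptions/$SUBSCRIPTION/resourceGroups/$RG/providers/Microsoft.Storage/storageAccounts/$ACCOUNT"

# Storage Blob Data Contributor at CONTAINER scope (read/write/list on blobs)
az role assignment create \
  --assignee-object-id "$PRINCIPAL_ID" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/$SUBSCRIPTION/resourceGroups/$RG/providers/Microsoft.Storage/storageAccounts/$ACCOUNT/blobServices/default/containers/$CONTAINER"
```

Role assignments can take a few minutes to propagate before the delegation-key request succeeds.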
      <pubDate>Fri, 06 Feb 2026 21:32:32 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/azure-databricks-exporting-data-frame-to-external-volume/m-p/147294#M4814</guid>
      <dc:creator>nayan_wylde</dc:creator>
      <dc:date>2026-02-06T21:32:32Z</dc:date>
    </item>
    <item>
      <title>Re: Azure Databricks - Exporting data frame to external volume</title>
      <link>https://community.databricks.com/t5/administration-architecture/azure-databricks-exporting-data-frame-to-external-volume/m-p/147861#M4827</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/214701"&gt;@lion_king_84&lt;/a&gt;,&amp;nbsp;could the &lt;SPAN&gt;WRITE VOLUME&amp;nbsp;privilege be missing on this schema in UC?&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 10 Feb 2026 09:43:08 GMT</pubDate>
      <guid>https://community.databricks.com/t5/administration-architecture/azure-databricks-exporting-data-frame-to-external-volume/m-p/147861#M4827</guid>
      <dc:creator>saurabh18cs</dc:creator>
      <dc:date>2026-02-10T09:43:08Z</dc:date>
    </item>
  </channel>
</rss>

