<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic How to use Azure Data Lake as a storage location to store the Delta Live Tables? in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/how-to-use-azure-data-lake-as-a-storage-location-to-store-the/m-p/31290#M22772</link>
    <description>&lt;P&gt;I am trying to write data into Azure Data Lake. I am reading files from Azure Blob Storage; however, when I try to create the Delta Live Table in Azure Data Lake, I get the following error:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.InvalidConfigurationValueException: Invalid configuration value detected for fs.azure.account.key&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I also tried to create the DLT in Azure Blob Storage, but it doesn't seem to recognize the container in the Azure Storage Account.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="image"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2165i2577293378CD4038/image-size/large?v=v2&amp;amp;px=999" role="button" title="image" alt="image" /&gt;&lt;/span&gt;I just specify the storage location via the DLT UI; I am not sure if there are any additional parameters to be configured to make it work.&lt;/P&gt;</description>
    <pubDate>Thu, 20 Jan 2022 14:32:50 GMT</pubDate>
    <dc:creator>SM</dc:creator>
    <dc:date>2022-01-20T14:32:50Z</dc:date>
    <item>
      <title>How to use Azure Data Lake as a storage location to store the Delta Live Tables?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-azure-data-lake-as-a-storage-location-to-store-the/m-p/31290#M22772</link>
      <description>&lt;P&gt;I am trying to write data into Azure Data Lake. I am reading files from Azure Blob Storage; however, when I try to create the Delta Live Table in Azure Data Lake, I get the following error:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;B&gt;shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.contracts.exceptions.InvalidConfigurationValueException: Invalid configuration value detected for fs.azure.account.key&lt;/B&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I also tried to create the DLT in Azure Blob Storage, but it doesn't seem to recognize the container in the Azure Storage Account.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper" image-alt="image"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/2165i2577293378CD4038/image-size/large?v=v2&amp;amp;px=999" role="button" title="image" alt="image" /&gt;&lt;/span&gt;I just specify the storage location via the DLT UI; I am not sure if there are any additional parameters to be configured to make it work.&lt;/P&gt;</description>
      <pubDate>Thu, 20 Jan 2022 14:32:50 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-azure-data-lake-as-a-storage-location-to-store-the/m-p/31290#M22772</guid>
      <dc:creator>SM</dc:creator>
      <dc:date>2022-01-20T14:32:50Z</dc:date>
    </item>
    <item>
      <title>Re: How to use Azure Data Lake as a storage location to store the Delta Live Tables?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-azure-data-lake-as-a-storage-location-to-store-the/m-p/31291#M22773</link>
      <description>&lt;P&gt;Hi there,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;My name is Piper, and I'm a moderator for Databricks. Welcome to the community! Thank you for your question. Let's give your peers a chance to respond before we circle back to this.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thank you for your patience!&lt;/P&gt;</description>
      <pubDate>Thu, 20 Jan 2022 16:40:58 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-azure-data-lake-as-a-storage-location-to-store-the/m-p/31291#M22773</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2022-01-20T16:40:58Z</dc:date>
    </item>
    <item>
      <title>Re: How to use Azure Data Lake as a storage location to store the Delta Live Tables?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-azure-data-lake-as-a-storage-location-to-store-the/m-p/31293#M22775</link>
      <description>&lt;P&gt;@Kaniz Fatma​&amp;nbsp;I don't think you quite understand the question. I'm running into the same problem. When creating a Delta Live Tables pipeline that writes to Azure Data Lake Storage (abfss://...) as the Storage Location, the pipeline fails with the error @Shikha Mathew​&amp;nbsp;mentioned in the original post.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Typically, when you create a cluster to run a regular notebook, you configure the various OAuth settings in the cluster's advanced options. I've tried adding those configurations to the Delta Live Tables pipeline, but I get the same error.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Also, we're not using the fs.azure.account.key settings, since we use OAuth authentication to ADLS.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Any suggestions?&lt;/P&gt;</description>
      <pubDate>Wed, 05 Oct 2022 20:29:54 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-azure-data-lake-as-a-storage-location-to-store-the/m-p/31293#M22775</guid>
      <dc:creator>RThornton</dc:creator>
      <dc:date>2022-10-05T20:29:54Z</dc:date>
    </item>
    <item>
      <title>Re: How to use Azure Data Lake as a storage location to store the Delta Live Tables?</title>
      <link>https://community.databricks.com/t5/data-engineering/how-to-use-azure-data-lake-as-a-storage-location-to-store-the/m-p/31294#M22776</link>
      <description>&lt;P&gt;I had the same issue using an abfss://... path as the DLT pipeline storage location. @Robert Thornton​&amp;nbsp;, I mounted the abfss path as '/mnt/data' using the same service principal and secret that I used when I configured OAuth. I then changed my DLT pipeline's storage location to use /mnt/data/..path.. and it worked.&lt;/P&gt;</description>
      <pubDate>Fri, 25 Nov 2022 01:45:31 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/how-to-use-azure-data-lake-as-a-storage-location-to-store-the/m-p/31294#M22776</guid>
      <dc:creator>evogelpohl</dc:creator>
      <dc:date>2022-11-25T01:45:31Z</dc:date>
    </item>
  </channel>
</rss>

