<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Quota Limit Exhausted Error when Creating Data Ingestion with SQL Server Connector (Azure) in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136218#M50523</link>
    <description>&lt;P&gt;Thank you for all of your assistance!&lt;/P&gt;</description>
    <pubDate>Mon, 27 Oct 2025 15:04:42 GMT</pubDate>
    <dc:creator>Adam_Borlase</dc:creator>
    <dc:date>2025-10-27T15:04:42Z</dc:date>
    <item>
      <title>Quota Limit Exhausted Error when Creating Data Ingestion with SQL Server Connector (Azure)</title>
      <link>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136180#M50510</link>
      <description>&lt;P&gt;Good day all,&lt;BR /&gt;&lt;BR /&gt;I am having an issue with our first Data Ingestion Pipeline. I want to connect to our Azure SQL Server with our Unity Connector (I can access the data in Unity Catalog).&lt;BR /&gt;&lt;BR /&gt;At Step 3 of the process (Source), while it is scanning the data in our database, I get a failure error saying the quota has been exceeded. At this stage I have not selected which virtual machine should be used; I can see that an FS virtual machine has been allocated, but I am still within our quota limits.&lt;BR /&gt;&lt;BR /&gt;I am trying to find out which quota needs to be increased or checked, or whether anything else can be done to create the Data Ingestion Pipeline, so I can start moving our team over to Delta Lake.&lt;BR /&gt;&lt;BR /&gt;I have included a picture of our quotas and the error message I am getting at this stage of the creation.&lt;/P&gt;</description>
      <pubDate>Mon, 27 Oct 2025 12:58:04 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136180#M50510</guid>
      <dc:creator>Adam_Borlase</dc:creator>
      <dc:date>2025-10-27T12:58:04Z</dc:date>
    </item>
    <item>
      <title>Re: Quota Limit Exhausted Error when Creating Data Ingestion with SQL Server Connector (Azure)</title>
      <link>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136188#M50511</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/177328"&gt;@Adam_Borlase&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;I've checked, and by default, if you don't configure the gateway settings yourself, it will create VMs of the following types. So check whether you have exceeded the quota for the&amp;nbsp;Standard_F4s or the&amp;nbsp;Standard_E4d_v4 family of VMs.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="szymon_dybczak_0-1761571861166.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/21060i3C82A840BABF9349/image-size/medium?v=v2&amp;amp;px=400" role="button" title="szymon_dybczak_0-1761571861166.png" alt="szymon_dybczak_0-1761571861166.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;My finding is in line with the following thread about a similar issue:&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.databricks.com/t5/get-started-discussions/issue-with-the-sql-server-ingestion-in-databricks-lakflow/td-p/122226" target="_blank"&gt;Issue with the SQL Server ingestion in Databricks ... - Databricks Community - 122226&lt;/A&gt;&lt;/P&gt;&lt;P&gt;If you want greater control over which VMs are used for the gateway, just follow the steps in the official tutorial:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="szymon_dybczak_1-1761572064736.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/21061i48378BDB10A6EAFC/image-size/medium?v=v2&amp;amp;px=400" role="button" title="szymon_dybczak_1-1761572064736.png" alt="szymon_dybczak_1-1761572064736.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 27 Oct 2025 13:34:48 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136188#M50511</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2025-10-27T13:34:48Z</dc:date>
    </item>
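As a quick way to verify the VM-family quotas discussed above, the Azure CLI can list current vCPU usage against the subscription limits per region. This is a sketch only: the region and the exact family names (standardFSFamily for Standard_F4s, standardEDv4Family for Standard_E4d_v4) are assumptions to verify against your own subscription's output.

```shell
# List vCPU usage vs. quota for all VM families in the workspace's region
# (region is an assumption -- substitute your own).
az vm list-usage --location westeurope --output table

# Filter to the families the gateway may allocate by default
# (family names are assumptions; confirm them in the full listing above).
az vm list-usage --location westeurope \
  --query "[?contains(name.value, 'standardFSFamily') || contains(name.value, 'standardEDv4Family')]" \
  --output table
```

If the CurrentValue column is close to the Limit for either family, that quota is the one to raise via an Azure support request.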
    <item>
      <title>Re: Quota Limit Exhausted Error when Creating Data Ingestion with SQL Server Connector (Azure)</title>
      <link>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136210#M50519</link>
      <description>&lt;P&gt;This has helped a lot, thanks so much for your feedback.&lt;BR /&gt;&lt;BR /&gt;I have created a compute policy with specific compute nodes for DLT, but it seems that the data ingestion creation is still creating a DLT node with the unrestricted policy. How can I set it to use my policy as the default?&lt;/P&gt;</description>
      <pubDate>Mon, 27 Oct 2025 14:44:02 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136210#M50519</guid>
      <dc:creator>Adam_Borlase</dc:creator>
      <dc:date>2025-10-27T14:44:02Z</dc:date>
    </item>
    <item>
      <title>Re: Quota Limit Exhausted Error when Creating Data Ingestion with SQL Server Connector (Azure)</title>
      <link>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136216#M50521</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/177328"&gt;@Adam_Borlase&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;To apply your policy you need to use the API (either the REST API or the Databricks CLI). It's mentioned in the docs; unfortunately, there is currently no option to do it in the UI.&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="szymon_dybczak_1-1761577248670.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/21071iBA586CA41C667157/image-size/medium?v=v2&amp;amp;px=400" role="button" title="szymon_dybczak_1-1761577248670.png" alt="szymon_dybczak_1-1761577248670.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Basically, you need to use the Pipelines API and, in the clusters definition, provide the policy_id and set apply_policy_default_values to true:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="szymon_dybczak_2-1761577345525.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/21072iE1F525F2577BF02A/image-size/medium?v=v2&amp;amp;px=400" role="button" title="szymon_dybczak_2-1761577345525.png" alt="szymon_dybczak_2-1761577345525.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.databricks.com/aws/en/ldp/configure-compute#cluster-policy" target="_blank" rel="noopener"&gt;Configure classic compute for Lakeflow Declarative Pipelines | Databricks on AWS&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 27 Oct 2025 15:03:31 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136216#M50521</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2025-10-27T15:03:31Z</dc:date>
    </item>
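The policy attachment described in the reply above can be sketched as a partial Pipelines API request body. This is a minimal sketch under stated assumptions: the policy ID is a placeholder, and the exact endpoint and HTTP verb for updating an existing pipeline should be checked against the linked Databricks documentation.

```python
import json

# Placeholder -- substitute the ID of the compute policy you created.
POLICY_ID = "your-policy-id"

# Partial Pipelines API body: attach the cluster policy to the pipeline's
# default cluster and apply the policy's default values, as the docs describe.
payload = {
    "clusters": [
        {
            "label": "default",
            "policy_id": POLICY_ID,
            "apply_policy_default_values": True,
        }
    ]
}

# This JSON can be passed to the Pipelines API via the REST API or the
# Databricks CLI when creating or updating the ingestion pipeline.
print(json.dumps(payload, indent=2))
```

The key detail from the reply is apply_policy_default_values: without it set to true, the policy is attached but its default values are not applied to the cluster.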
    <item>
      <title>Re: Quota Limit Exhausted Error when Creating Data Ingestion with SQL Server Connector (Azure)</title>
      <link>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136218#M50523</link>
      <description>&lt;P&gt;Thank you for all of your assistance!&lt;/P&gt;</description>
      <pubDate>Mon, 27 Oct 2025 15:04:42 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/quota-limit-exhausted-error-when-creating-data-ingestion-with/m-p/136218#M50523</guid>
      <dc:creator>Adam_Borlase</dc:creator>
      <dc:date>2025-10-27T15:04:42Z</dc:date>
    </item>
  </channel>
</rss>

