<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Azure Databricks Serverless – SFTP Connectivity (external provider) in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/azure-databricks-serverless-sftp-connectivity-external-provider/m-p/156277#M54396</link>
    <description>&lt;P&gt;&lt;STRONG&gt;Recommendation:&lt;/STRONG&gt; if the external SFTP vendor &lt;STRONG&gt;strictly requires source-IP allowlisting&lt;/STRONG&gt;, the most reliable path is usually &lt;STRONG&gt;classic compute with your own NAT gateway/static public IP&lt;/STRONG&gt;. For &lt;STRONG&gt;serverless&lt;/STRONG&gt;, Azure Databricks can reach public external resources via &lt;STRONG&gt;NAT IPs&lt;/STRONG&gt;, but obtaining a &lt;STRONG&gt;deterministic allowlistable outbound IP set&lt;/STRONG&gt; is not a simple self-serve workflow today and may require &lt;STRONG&gt;account-team/private-preview support&lt;/STRONG&gt;.&lt;/P&gt;
&lt;H3&gt;Option 1 — &lt;STRONG&gt;Recommended&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P&gt;Use &lt;STRONG&gt;classic compute&lt;/STRONG&gt; (ideally VNet-injected) with &lt;STRONG&gt;your own NAT gateway / static public IP&lt;/STRONG&gt;, and have the SFTP provider allowlist that IP. Databricks docs explicitly recommend a stable egress IP for external systems when allowlisting is required.&lt;/P&gt;
&lt;H3&gt;Option 2&lt;/H3&gt;
&lt;P&gt;Stay on &lt;STRONG&gt;serverless&lt;/STRONG&gt;, but involve your &lt;STRONG&gt;Databricks account team&lt;/STRONG&gt; to enable the &lt;STRONG&gt;stable serverless outbound NAT IP&lt;/STRONG&gt; path. Azure docs note that serverless reaches non-private resources using &lt;STRONG&gt;NAT IPs&lt;/STRONG&gt;, and the newer outbound-IP mechanism is in &lt;STRONG&gt;preview&lt;/STRONG&gt; and delivered via a JSON endpoint, while the older static lists are being retired.&lt;/P&gt;
&lt;H3&gt;Option 3&lt;/H3&gt;
&lt;P&gt;If the provider can expose the SFTP endpoint through &lt;STRONG&gt;Azure Private Link / a private endpoint path&lt;/STRONG&gt; (for example, via an Azure-hosted front end or your VNet), use an &lt;STRONG&gt;NCC private endpoint&lt;/STRONG&gt; from serverless. This is the cleanest serverless-native option, but it is only practical if the endpoint can be presented as an Azure/VNet private target rather than a generic internet SFTP host.&lt;/P&gt;
&lt;P&gt;A few practical notes:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;The &lt;STRONG&gt;Lakeflow Connect SFTP connector&lt;/STRONG&gt; is supported on &lt;STRONG&gt;serverless and classic&lt;/STRONG&gt; (DBR &lt;STRONG&gt;17.3+&lt;/STRONG&gt;), and the docs specifically say the SFTP server must allow either the &lt;STRONG&gt;Databricks VPC/VNet range&lt;/STRONG&gt; for classic or the &lt;STRONG&gt;stable IPs&lt;/STRONG&gt; for serverless.&lt;/LI&gt;
&lt;LI&gt;If you use &lt;STRONG&gt;serverless egress control&lt;/STRONG&gt;, you can explicitly allow the SFTP &lt;STRONG&gt;FQDN&lt;/STRONG&gt;, but that controls Databricks outbound policy; it does &lt;STRONG&gt;not&lt;/STRONG&gt; replace the vendor’s inbound source-IP allowlist requirement.&lt;/LI&gt;
&lt;/UL&gt;</description>
    <pubDate>Wed, 06 May 2026 15:40:49 GMT</pubDate>
    <dc:creator>Lu_Wang_ENB_DBX</dc:creator>
    <dc:date>2026-05-06T15:40:49Z</dc:date>
    <item>
      <title>Azure Databricks Serverless – SFTP Connectivity (external provider)</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-serverless-sftp-connectivity-external-provider/m-p/155002#M54169</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I need to establish connectivity from Azure Databricks serverless compute to an SFTP server hosted by an external provider outside our organization.&lt;/P&gt;&lt;P&gt;From my research, one approach is IP whitelisting:&lt;/P&gt;&lt;P&gt;1) The SFTP provider requires IP whitelisting for inbound connections. How can I identify or obtain the egress IP addresses of the serverless compute environment?&lt;/P&gt;&lt;P&gt;2) Are there any recommended alternatives or supported approaches to enable connectivity from serverless compute to an external SFTP server?&lt;/P&gt;&lt;P&gt;Kindly assist.&lt;/P&gt;</description>
      <pubDate>Tue, 21 Apr 2026 04:01:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-serverless-sftp-connectivity-external-provider/m-p/155002#M54169</guid>
      <dc:creator>ittzzmalind</dc:creator>
      <dc:date>2026-04-21T04:01:34Z</dc:date>
    </item>
    <item>
      <title>Re: Azure Databricks Serverless – SFTP Connectivity (external provider)</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-serverless-sftp-connectivity-external-provider/m-p/155010#M54172</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/223614"&gt;@ittzzmalind&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;How to do it is described in the following section of the docs:&lt;/P&gt;&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/resources/ip-domain-region#control-plane-ip-addresses" target="_blank" rel="noopener"&gt;IP addresses and domains for Azure Databricks services and assets - Azure Databricks | Microsoft Learn&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="szymon_dybczak_0-1776753988956.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/26203iE6FAE2BFCA5DB71A/image-size/medium?v=v2&amp;amp;px=400" role="button" title="szymon_dybczak_0-1776753988956.png" alt="szymon_dybczak_0-1776753988956.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Keep in mind that&amp;nbsp;&lt;SPAN&gt;Azure Databricks might update outbound IPs as often as once &lt;STRONG&gt;every 30 days&lt;/STRONG&gt;. Updated IPs become active as soon as 60 days after publication. After new Azure Databricks regions become available, their active IPs are published to the file.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;If the above answer was helpful, please consider marking it as accepted solution.&lt;/STRONG&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 21 Apr 2026 06:53:48 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-serverless-sftp-connectivity-external-provider/m-p/155010#M54172</guid>
      <dc:creator>szymon_dybczak</dc:creator>
      <dc:date>2026-04-21T06:53:48Z</dc:date>
    </item>
    <item>
      <title>Re: Azure Databricks Serverless – SFTP Connectivity (external provider)</title>
      <link>https://community.databricks.com/t5/data-engineering/azure-databricks-serverless-sftp-connectivity-external-provider/m-p/156277#M54396</link>
      <description>&lt;P&gt;&lt;STRONG&gt;Recommendation:&lt;/STRONG&gt; if the external SFTP vendor &lt;STRONG&gt;strictly requires source-IP allowlisting&lt;/STRONG&gt;, the most reliable path is usually &lt;STRONG&gt;classic compute with your own NAT gateway/static public IP&lt;/STRONG&gt;. For &lt;STRONG&gt;serverless&lt;/STRONG&gt;, Azure Databricks can reach public external resources via &lt;STRONG&gt;NAT IPs&lt;/STRONG&gt;, but obtaining a &lt;STRONG&gt;deterministic allowlistable outbound IP set&lt;/STRONG&gt; is not a simple self-serve workflow today and may require &lt;STRONG&gt;account-team/private-preview support&lt;/STRONG&gt;.&lt;/P&gt;
&lt;H3&gt;Option 1 — &lt;STRONG&gt;Recommended&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P&gt;Use &lt;STRONG&gt;classic compute&lt;/STRONG&gt; (ideally VNet-injected) with &lt;STRONG&gt;your own NAT gateway / static public IP&lt;/STRONG&gt;, and have the SFTP provider allowlist that IP. Databricks docs explicitly recommend a stable egress IP for external systems when allowlisting is required.&lt;/P&gt;
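As a sketch of the Azure side of Option 1 (all resource names below are placeholders, and your VNet-injection layout will differ), provisioning a static egress IP via a NAT gateway typically looks like:

```shell
# Placeholder names throughout; adjust resource group, VNet, and subnet
# names to your VNet-injected Databricks workspace. Sketch, not a full recipe.
az network public-ip create \
  --resource-group my-rg --name dbx-egress-pip \
  --sku Standard --allocation-method Static

az network nat gateway create \
  --resource-group my-rg --name dbx-natgw \
  --public-ip-addresses dbx-egress-pip --idle-timeout 10

# Attach the NAT gateway to the workspace subnet(s) used for egress
az network vnet subnet update \
  --resource-group my-rg --vnet-name dbx-vnet \
  --name public-subnet --nat-gateway dbx-natgw
```

The static public IP created in the first step is what you hand to the SFTP vendor for their inbound allowlist.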
&lt;H3&gt;Option 2&lt;/H3&gt;
&lt;P&gt;Stay on &lt;STRONG&gt;serverless&lt;/STRONG&gt;, but involve your &lt;STRONG&gt;Databricks account team&lt;/STRONG&gt; to enable the &lt;STRONG&gt;stable serverless outbound NAT IP&lt;/STRONG&gt; path. Azure docs note that serverless reaches non-private resources using &lt;STRONG&gt;NAT IPs&lt;/STRONG&gt;, and the newer outbound-IP mechanism is in &lt;STRONG&gt;preview&lt;/STRONG&gt; and delivered via a JSON endpoint, while the older static lists are being retired.&lt;/P&gt;
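The outbound-IP mechanism mentioned above publishes the IPs as JSON, but the exact URL and schema are preview details your account team would confirm. Purely as an illustration, assuming a payload shaped like the sample below (the "prefixes", "region", and "ip_prefix" keys are my assumptions, not the real contract), collecting the CIDR prefixes to allowlist for one region might look like:

```python
# Illustrative only: the real endpoint URL and JSON schema are preview
# details; the "prefixes"/"region"/"ip_prefix" keys are assumptions.
import json

sample_payload = (
    '{"prefixes": ['
    '{"region": "westeurope", "ip_prefix": "4.1.2.0/24"}, '
    '{"region": "eastus", "ip_prefix": "4.3.4.0/24"}]}'
)

def region_prefixes(payload_text, region):
    """Collect CIDR prefixes for one region from the assumed payload shape."""
    data = json.loads(payload_text)
    return [p["ip_prefix"] for p in data.get("prefixes", []) if p.get("region") == region]

print(region_prefixes(sample_payload, "westeurope"))  # ['4.1.2.0/24']
```

Because the published IPs can rotate, any such filtering should be re-run on a schedule and the vendor's allowlist updated accordingly.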
&lt;H3&gt;Option 3&lt;/H3&gt;
&lt;P&gt;If the provider can expose the SFTP endpoint through &lt;STRONG&gt;Azure Private Link / a private endpoint path&lt;/STRONG&gt; (for example, via an Azure-hosted front end or your VNet), use an &lt;STRONG&gt;NCC private endpoint&lt;/STRONG&gt; from serverless. This is the cleanest serverless-native option, but it is only practical if the endpoint can be presented as an Azure/VNet private target rather than a generic internet SFTP host.&lt;/P&gt;
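Whichever option you pick, a quick sanity check from a notebook tells you whether the vendor's firewall is currently letting your compute through. This is a generic TCP reachability probe (not an SFTP login test), and the host name is a placeholder for your vendor's endpoint:

```python
# Generic TCP reachability probe; the host below is a placeholder.
# A False result usually means the vendor firewall or egress policy
# is blocking you, not that SFTP credentials are wrong.
import socket

def sftp_port_reachable(host, port=22, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder host): sftp_port_reachable("sftp.example-vendor.com")
```

Running this from both serverless and classic compute is a fast way to confirm which egress path the vendor has actually allowlisted.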
&lt;P&gt;A few practical notes:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;The &lt;STRONG&gt;Lakeflow Connect SFTP connector&lt;/STRONG&gt; is supported on &lt;STRONG&gt;serverless and classic&lt;/STRONG&gt; (DBR &lt;STRONG&gt;17.3+&lt;/STRONG&gt;), and the docs specifically say the SFTP server must allow either the &lt;STRONG&gt;Databricks VPC/VNet range&lt;/STRONG&gt; for classic or the &lt;STRONG&gt;stable IPs&lt;/STRONG&gt; for serverless.&lt;/LI&gt;
&lt;LI&gt;If you use &lt;STRONG&gt;serverless egress control&lt;/STRONG&gt;, you can explicitly allow the SFTP &lt;STRONG&gt;FQDN&lt;/STRONG&gt;, but that controls Databricks outbound policy; it does &lt;STRONG&gt;not&lt;/STRONG&gt; replace the vendor’s inbound source-IP allowlist requirement.&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Wed, 06 May 2026 15:40:49 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/azure-databricks-serverless-sftp-connectivity-external-provider/m-p/156277#M54396</guid>
      <dc:creator>Lu_Wang_ENB_DBX</dc:creator>
      <dc:date>2026-05-06T15:40:49Z</dc:date>
    </item>
  </channel>
</rss>

