
Azure Databricks Serverless – SFTP Connectivity (external provider)

ittzzmalind
New Contributor III

Hi,

I need to establish connectivity from Azure Databricks serverless compute to an SFTP provider hosted outside our organization (an external provider).

When I searched, the main approach I found was IP whitelisting, which raises two questions:

1) The SFTP provider requires IP whitelisting for inbound connections. How do I identify or obtain the egress IP addresses of the serverless compute environment?

2) Are there any recommended alternatives or supported approaches to enable connectivity from serverless compute to an external SFTP server?

Kindly assist

2 REPLIES

szymon_dybczak
Esteemed Contributor III

Hi @ittzzmalind ,

How to do it is described in the following section of the docs:

IP addresses and domains for Azure Databricks services and assets - Azure Databricks | Microsoft Learn


Keep in mind that Azure Databricks might update the outbound IPs as often as once every 30 days. Updated IPs become active no sooner than 60 days after publication, and when new Azure Databricks regions come online, their active IPs are added to the published file.
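If you want to consume the published list programmatically, a minimal sketch is below. The URL and JSON schema are placeholders, not the real ones -- take the actual file location and record shape from the Microsoft Learn page linked above.

```python
# Sketch: fetch the published outbound IP list and print the entries for one region.
# IP_LIST_URL and the assumed record shape are hypothetical; adjust both to match
# the file actually referenced on the Microsoft Learn page.
import json
import urllib.request

IP_LIST_URL = "https://<published-ip-list-from-the-docs>"  # placeholder
REGION = "westeurope"                                      # your workspace's Azure region

with urllib.request.urlopen(IP_LIST_URL) as resp:
    data = json.load(resp)

# Assumed shape: a list of {"region": "...", "ips": ["x.x.x.x/32", ...]} records.
for entry in data:
    if entry.get("region") == REGION:
        print("\n".join(entry.get("ips", [])))
```

Given the 30-day update cadence above, it is worth re-running a check like this on a schedule rather than allowlisting once and forgetting it.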

If the above answer was helpful, please consider marking it as the accepted solution.

Lu_Wang_ENB_DBX
Databricks Employee

Recommendation: if the external SFTP vendor strictly requires source-IP allowlisting, the most reliable path is usually classic compute with your own NAT gateway/static public IP. For serverless, Azure Databricks can reach public external resources via NAT IPs, but obtaining a deterministic allowlistable outbound IP set is not a simple self-serve workflow today and may require account-team/private-preview support.

Option 1 โ€” Recommended

Use classic compute (ideally VNet-injected) with your own NAT gateway / static public IP, and have the SFTP provider whitelist that IP. The Databricks docs explicitly recommend a stable egress IP for external systems that require allowlisting.
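If it helps, here is a minimal sketch of the Azure side using the Python management SDK. The subscription, resource group, VNet, and subnet names are all hypothetical, and a NAT gateway requires Standard-SKU resources:

```python
# pip install azure-identity azure-mgmt-network
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import SubResource

SUB_ID = "<subscription-id>"        # placeholder
RG = "dbx-vnet-rg"                  # hypothetical resource group
VNET = "dbx-vnet"                   # hypothetical VNet used for VNet injection
LOC = "westeurope"

net = NetworkManagementClient(DefaultAzureCredential(), SUB_ID)

# 1. Static public IP (Standard SKU, required by NAT gateway).
pip = net.public_ip_addresses.begin_create_or_update(
    RG, "dbx-egress-ip",
    {"location": LOC, "sku": {"name": "Standard"},
     "public_ip_allocation_method": "Static"},
).result()

# 2. NAT gateway bound to that IP.
natgw = net.nat_gateways.begin_create_or_update(
    RG, "dbx-egress-natgw",
    {"location": LOC, "sku": {"name": "Standard"},
     "public_ip_addresses": [{"id": pip.id}]},
).result()

# 3. Attach the NAT gateway to both Databricks subnets, so all cluster egress
#    leaves through the single static IP the vendor can whitelist.
for subnet_name in ("databricks-public", "databricks-private"):
    subnet = net.subnets.get(RG, VNET, subnet_name)
    subnet.nat_gateway = SubResource(id=natgw.id)
    net.subnets.begin_create_or_update(RG, VNET, subnet_name, subnet).result()

print("Egress IP to give the SFTP vendor:", pip.ip_address)
```

The same topology can of course be built in the portal, CLI, or Terraform; the point is that with classic compute the egress IP is yours and never rotates unless you rotate it.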

Option 2

Stay on serverless, but involve your Databricks account team to obtain or enable the serverless outbound IP / stable NAT IP path. The Azure docs note that serverless reaches non-private resources using NAT IPs; the newer outbound-IP mechanism is in preview and delivered via a JSON endpoint, while the old static lists are being retired.

Option 3

If the provider can expose the SFTP endpoint through Azure Private Link / a private endpoint path (for example, via an Azure-hosted front end or your VNet), use an NCC private endpoint from serverless. This is the cleanest serverless-native option, but it is only practical if the endpoint can be presented as an Azure/VNet private target rather than a generic internet SFTP host.
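For reference, here is a rough sketch of the account-level calls involved, using the Network Connectivity Configuration (NCC) API. The account ID, token, region, and the vendor-side resource ID are placeholders, and the request bodies should be verified against the current API reference:

```python
# Sketch of the NCC + private endpoint rule flow via the Databricks Account API.
# All identifiers below are placeholders; check the current API reference for the
# exact request/response fields before relying on this.
import requests

ACCOUNT_ID = "<databricks-account-id>"   # placeholder
TOKEN = "<account-admin-token>"          # placeholder
BASE = f"https://accounts.azuredatabricks.net/api/2.0/accounts/{ACCOUNT_ID}"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# 1. Create an NCC in the workspace's region.
ncc = requests.post(
    f"{BASE}/network-connectivity-configs",
    headers=HEADERS,
    json={"name": "sftp-ncc", "region": "westeurope"},
).json()
ncc_id = ncc["network_connectivity_config_id"]

# 2. Add a private endpoint rule targeting the vendor's Azure-hosted endpoint
#    (e.g. a Private Link service or storage front end they expose).
rule = requests.post(
    f"{BASE}/network-connectivity-configs/{ncc_id}/private-endpoint-rules",
    headers=HEADERS,
    json={
        "resource_id": "/subscriptions/<sub>/resourceGroups/<rg>/providers/<vendor-resource>",  # placeholder
        "group_id": "<target-sub-resource>",  # placeholder
    },
).json()
print(rule)

# 3. Remaining steps (not shown): bind the NCC to the workspace, and have the
#    vendor approve the pending private endpoint connection on their side.
```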

A few practical notes:

  • The Lakeflow Connect SFTP connector is supported on serverless and classic (DBR 17.3+), and the docs specifically say the SFTP server must allow either the Databricks VPC/VNet range for classic or the stable IPs for serverless.
  • If you use serverless egress control, you can explicitly allow the SFTP FQDN, but that controls Databricks outbound policy; it does not replace the vendor's inbound source-IP allowlist requirement. A quick end-to-end connectivity check is sketched after this list.
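Once one of the paths above is in place, a quick end-to-end check from a notebook looks roughly like this. The host, port, username, and secret scope are placeholders, and paramiko is just one common client (it is not preinstalled, so install it with %pip first):

```python
# %pip install paramiko   (run in its own notebook cell first)
# Minimal SFTP reachability test; placeholders throughout.
import paramiko

HOST = "sftp.vendor.example.com"   # placeholder: the vendor's SFTP endpoint
PORT = 22
USER = "svc_databricks"            # placeholder
PASSWORD = dbutils.secrets.get("sftp", "password")  # assumes a secret scope named "sftp"

transport = paramiko.Transport((HOST, PORT))
transport.connect(username=USER, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
print(sftp.listdir("."))           # a successful listing proves end-to-end connectivity
sftp.close()
transport.close()
```

If this times out rather than failing authentication, the vendor's allowlist (or your egress policy) is the first thing to check.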