Hello @MGAutomation @szymon_dybczak
You may also need to open your on-premises SQL Server's firewall to the CIDR range of your Databricks VPC, so that the EC2 instances Databricks launches have IPs that can actually reach your database.
If your goal is mainly to query or analyze the data, and you already have connectivity through another tool or service such as AWS DMS or Airbyte (whether for testing or in a corporate environment), the easiest approach is to export the data with those extractors directly into S3 and then read it from there.
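If you go the S3 route, reading the exported files back is simple. A minimal sketch in Databricks SQL; the bucket path and file format are placeholders for wherever your extractor lands the data:

```sql
-- Hypothetical path: point this at wherever DMS/Airbyte writes its output.
SELECT *
FROM read_files(
  's3://my-bucket/dms-output/',
  format => 'parquet'
);
```

From there you can `CREATE TABLE ... AS SELECT` to materialize it as a Delta table if you want it in the catalog.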
Keep in mind that you could also use Lakehouse Federation to reach on-premises sources by configuring a federated catalog that points to your SQL Server. For this you'll need a Serverless or Pro SQL Warehouse if you're going through a SQL endpoint. This option gives you greater visibility and a more convenient way of working than a raw JDBC call, provided you have the right database permissions. Just be aware of the limitations that come with federation. Docs
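For reference, the federation setup boils down to two statements. A minimal sketch where the host, credentials, and database names are all placeholders (in a real setup, store the password as a Databricks secret rather than inline):

```sql
-- Placeholders throughout; use a secret for the password in real setups.
CREATE CONNECTION sqlserver_onprem TYPE sqlserver
OPTIONS (
  host 'sqlserver.mycompany.internal',
  port '1433',
  user 'databricks_reader',
  password 'REDACTED'
);

CREATE FOREIGN CATALOG sqlserver_fed
USING CONNECTION sqlserver_onprem
OPTIONS (database 'my_database');
```

After that, the SQL Server schemas and tables show up under `sqlserver_fed` in Unity Catalog and you can query them like any other catalog.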
Hope this helps 🙂
Isi