Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

SFTP Connect

ajain80
New Contributor III

How can I connect to an SFTP server from Databricks, so that I can write files directly into tables?

1 ACCEPTED SOLUTION


Hubert-Dudek
Esteemed Contributor III

The classic solution is to copy the data from the FTP/SFTP server to ADLS storage using Azure Data Factory, and once the copy completes in the ADF pipeline, trigger the Databricks notebook.


5 REPLIES

Debayan
Databricks Employee

kalo1
New Contributor II

Thank you for the link; you made my day.


Cronk
New Contributor II

Thank you, I will try it. I hope it works.

Is there a way to copy data from an FTP/SFTP server into ADLS using PySpark in a Databricks notebook (i.e., without ADF)?
The spark-sftp library (https://github.com/springml/spark-sftp) that some have recommended does not appear to include a PySpark API.
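If ADF is off the table, one common workaround is to skip Spark for the transfer itself and use a general-purpose SFTP library such as paramiko to land the files on DBFS, then read them with Spark as usual. Below is a minimal sketch under that assumption; the host, port, credentials, and directory paths are placeholders, and paramiko would need to be installed on the cluster (e.g. `%pip install paramiko`).

```python
import os
import posixpath


def remote_to_local(remote_path: str, local_dir: str) -> str:
    """Map a remote SFTP file path to a destination path under local_dir."""
    return os.path.join(local_dir, posixpath.basename(remote_path))


def fetch_sftp_dir(host: str, port: int, user: str, password: str,
                   remote_dir: str, local_dir: str) -> list:
    """Download every file in remote_dir to local_dir; returns local paths."""
    import paramiko  # third-party: install with `%pip install paramiko`

    transport = paramiko.Transport((host, port))
    transport.connect(username=user, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    downloaded = []
    try:
        for name in sftp.listdir(remote_dir):
            remote_path = posixpath.join(remote_dir, name)
            local_path = remote_to_local(remote_path, local_dir)
            sftp.get(remote_path, local_path)
            downloaded.append(local_path)
    finally:
        sftp.close()
        transport.close()
    return downloaded
```

In a notebook you could then point `local_dir` at a DBFS-backed path such as `/dbfs/tmp/sftp_landing/` and load the result with Spark, e.g. `spark.read.csv("dbfs:/tmp/sftp_landing/").write.saveAsTable(...)`. Note this pulls files through the driver node, so it suits modest file volumes rather than large-scale parallel ingestion.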
