by clant • New Contributor II • 1402 Views • 1 reply • 4 kudos
Hello, is it possible to use an SFTP location as the source for Structured Streaming? At the moment we go SFTP -> S3 -> Databricks via Structured Streaming, and I would like to cut out the S3 part. Cheers, Chris
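For context, here is a minimal sketch of the S3 leg described above, assuming Databricks Auto Loader over a hypothetical bucket, landing path, file format, and target table; the SFTP-to-S3 hop happens outside of Spark.

```python
# A minimal sketch of the existing S3 -> Databricks leg (all paths and names are
# hypothetical). Auto Loader picks up the files that the SFTP transfer drops in S3.
# `spark` is the SparkSession predefined in Databricks notebooks.
df = (
    spark.readStream
    .format("cloudFiles")                 # Databricks Auto Loader
    .option("cloudFiles.format", "json")  # assumed file format
    .load("s3://my-bucket/sftp-landing/") # hypothetical S3 landing path
)

query = (
    df.writeStream
    .option("checkpointLocation", "s3://my-bucket/checkpoints/sftp-landing/")
    .toTable("bronze.sftp_data")          # hypothetical Delta target table
)
```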
Latest Reply
Hi @Chris Lant, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer first; otherwise, bricksters will get back to you soon. Thanks.
6095 Views • 1 reply • 3 kudos
I have a zip file on an SFTP location. I want to copy that file from the SFTP location into Azure Data Lake and unzip it there using a Spark notebook. Please help me solve this.
Latest Reply
I would go with @Kaniz Fatma's approach: download the data with Data Factory and, once the copy succeeds, trigger a Databricks Spark notebook. Spark can also read compressed data directly, so you may not even need a separate unzip step.
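Building on that, here is a minimal sketch of the unzip step inside a Databricks notebook, assuming the archive has already been copied into ADLS (for example by a Data Factory pipeline). The storage account, container, paths, and CSV format are assumptions for illustration.

```python
# A minimal sketch, assuming the zip already landed in ADLS; all paths are hypothetical.
import zipfile

src = "abfss://raw@mystorageaccount.dfs.core.windows.net/incoming/data.zip"
local_zip = "/tmp/data.zip"
local_out = "/tmp/data_unzipped"
dest = "abfss://raw@mystorageaccount.dfs.core.windows.net/unzipped/"

# Python's zipfile cannot open abfss:// paths, so copy the archive to the driver first.
dbutils.fs.cp(src, f"file:{local_zip}")

with zipfile.ZipFile(local_zip, "r") as zf:
    zf.extractall(local_out)

# Copy the extracted files back to the lake, then read them with Spark.
dbutils.fs.cp(f"file:{local_out}", dest, recurse=True)

df = spark.read.option("header", "true").csv(dest)  # assumed CSV content
display(df)
```

Note that Spark can read gzip- or bzip2-compressed files directly, as the reply suggests, but .zip archives generally need an explicit extraction step like the one above.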