I need to connect to a server over SFTP to retrieve some files using Spark and a private SSH key. However, to manage the private key safely I have to store it as a secret in Azure Key Vault, which means I don't have the key as a file on disk to pass in the keyFilePath option.
The code I have is this:
import com.github.arcizon.spark.filetransfer._
val df = spark.read
.option("host", "...")
.option("port", "22")
.option("username", "...")
.option("keyFilePath", "path/to/privatekey")
.option("fileFormat", "csv")
.option("delimiter", "|")
.option("header", "true")
.option("inferSchema", "true")
.option("encoding", "UTF-8")
.sftp("path/to/my/data.csv")
Is there an alternative where I can pass the contents of the private key directly, so I can load the data I want to manipulate with Spark? Or is there another way of solving this problem that I haven't considered?
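One workaround I've been considering is to fetch the secret's value as a string at runtime and write it to a temporary file with owner-only permissions, then pass that temp path to keyFilePath. A rough sketch of that idea (the dbutils.secrets call is an assumption about running on Databricks with a Key Vault-backed secret scope; the placeholder key string below is just for illustration):

```scala
import java.nio.file.{Files, Path}

// Assumption: on Databricks, the key contents would come from a
// Key Vault-backed secret scope, e.g.:
//   val privateKey = dbutils.secrets.get("my-scope", "sftp-private-key")
// Placeholder value for illustration only:
val privateKey = "-----BEGIN OPENSSH PRIVATE KEY-----\n...\n-----END OPENSSH PRIVATE KEY-----"

// Write the key to a temporary file readable only by the owner,
// and delete it when the JVM exits.
val keyPath: Path = Files.createTempFile("sftp-key-", ".pem")
Files.write(keyPath, privateKey.getBytes("UTF-8"))
keyPath.toFile.setReadable(false, false) // clear read permission for everyone
keyPath.toFile.setReadable(true, true)   // re-grant read permission to owner only
keyPath.toFile.deleteOnExit()

// Then pass the temp path to the reader:
// .option("keyFilePath", keyPath.toString)
println(keyPath.toString)
```

I'm not sure whether materializing the key on the driver's local disk like this is acceptable security-wise, or whether the library has a way to accept the key contents directly, which is really what I'm asking about.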