In the AWS Console, under "My security credentials", generate a new access key and secret key.
Set them in the Spark Hadoop configuration:
sc._jsc.hadoopConfiguration().set("fs.s3a.access.key", ACCESS_KEY)
sc._jsc.hadoopConfiguration().set("fs.s3a.secret.key", SECRET_KEY)
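If you prefer not to paste the keys into the notebook, one option is to read them from environment variables; a minimal sketch, assuming they were exported on the cluster as the standard AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY variables (adjust the names to whatever you actually set):
import os
# Hypothetical environment variable names; change them if your cluster uses different ones
ACCESS_KEY = os.environ["AWS_ACCESS_KEY_ID"]
SECRET_KEY = os.environ["AWS_SECRET_ACCESS_KEY"]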
Now you can read files from your S3 bucket directly through the s3a connector:
df = spark.read.csv(f"s3a://{aws_bucket_name}/test.csv", header=True, inferSchema=True)
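A quick way to confirm the credentials and the read worked (assuming test.csv exists at the root of your bucket):
# List the bucket contents and inspect the loaded DataFrame
display(dbutils.fs.ls(f"s3a://{aws_bucket_name}/"))
df.printSchema()
df.show(5)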
You can also mount the bucket permanently with dbutils.fs.mount:
dbutils.fs.mount(f"s3a://{ACCESS_KEY}:{SECRET_KEY}@{aws_bucket_name}", f"/mnt/{mount_name}")
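If the secret key contains a slash it will break the mount URL, so it is worth URL-encoding it first; a sketch of the full mount plus a read through the mount point, where mount_name is whatever label you choose:
# URL-encode the secret key so any "/" characters don't break the mount URL
ENCODED_SECRET_KEY = SECRET_KEY.replace("/", "%2F")
dbutils.fs.mount(f"s3a://{ACCESS_KEY}:{ENCODED_SECRET_KEY}@{aws_bucket_name}", f"/mnt/{mount_name}")

# Files in the bucket are now visible under the mount point
df = spark.read.csv(f"/mnt/{mount_name}/test.csv", header=True, inferSchema=True)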
It is safer to keep the access key and secret key in a secret store (for example, Databricks secrets or a cloud key vault) rather than hard-coding them in the notebook.
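A minimal sketch of pulling the keys from Databricks secrets; the scope name "aws" and the key names below are hypothetical and must be created beforehand with the Databricks CLI or API:
# Hypothetical secret scope and key names
ACCESS_KEY = dbutils.secrets.get(scope="aws", key="access_key")
SECRET_KEY = dbutils.secrets.get(scope="aws", key="secret_key")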