Option 1:
Mount an AWS S3 bucket using your AWS access key and secret key:
access_key = ""
secret_key = ""
# URL-encode the secret key so any "/" characters do not break the mount URI
encoded_secret_key = secret_key.replace("/", "%2F")
aws_bucket_name = "yourawsbucketname/"
mount_name = "youraliasmountname"
# If the mount point already exists, unmount it first:
# dbutils.fs.unmount(f"/mnt/{mount_name}")
dbutils.fs.mount(f"s3a://{access_key}:{encoded_secret_key}@{aws_bucket_name}", f"/mnt/{mount_name}")
After mounting, verify the contents in the next cell:
%fs ls /mnt/youraliasmountname
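Once the mount shows up, files under it can be read like any other DBFS path. A minimal sketch, assuming a hypothetical sample.csv at the bucket root (the file name is a placeholder, not part of the original post):
# Read a CSV from the mounted bucket into a Spark DataFrame
# "sample.csv" is an assumed example file, replace with your own
df = spark.read.option("header", "true").csv(f"/mnt/{mount_name}/sample.csv")
display(df)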
Option 2:
https://medium.com/@gchandra/databricks-how-to-load-data-from-google-drive-github-c98d6b34d1b5?sk=67...