If you want to read from your Azure storage account using Databricks Free Edition, you can pass the storage-account key as an option on the read itself:
# <...> values are placeholders for your own account, container, and path
spark.read \
    .option("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net", "<your-storage-account-key>") \
    .load("abfss://<container>@<storage-account-name>.dfs.core.windows.net/<path>")
I just tried it and it worked 🙂 However, I could not achieve the same with dbutils.fs.ls("abfss://...."), which works well in Azure Databricks; for dbutils I cannot find where to pass the same option.
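For the dbutils.fs.ls case, one thing worth trying is setting the key at the Spark session level with spark.conf.set rather than per read, since dbutils.fs has no per-call option. This is a hedged sketch, not verified on Free Edition: the <...> names are placeholders, and whether dbutils picks up session-level configuration (as opposed to cluster-level Spark config) depends on the environment.

```python
# Sketch: set the ABFS account key on the Spark session instead of per read.
# All <...> values are placeholders. dbutils may still ignore session-level
# config in some environments and require cluster-level Spark configuration.
spark.conf.set(
    "fs.azure.account.key.<storage-account-name>.dfs.core.windows.net",
    "<your-storage-account-key>",
)

# Then try listing the container with the same abfss URI scheme:
dbutils.fs.ls("abfss://<container>@<storage-account-name>.dfs.core.windows.net/")
```

If the session-level setting is not picked up by dbutils, the equivalent key/value pair can usually be supplied in the cluster's Spark configuration instead, where available.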