Hi @knight22-21
Could you share a bit more detail so that we can help better?
- How are you trying to connect? (e.g., spark.conf.set, dbutils.fs.mount, Python SDK, etc.)
- What exact error are you seeing?
The answer depends heavily on your method:
- dbutils.fs.mount(): not supported in Community Edition
- Python azure-storage-blob SDK: may work, but you'd first need to be able to pip install azure-storage-blob
```python
# This approach MIGHT work - worth trying.
# In a Databricks notebook, use the %pip magic so the package
# installs into the notebook's Python environment:
%pip install azure-storage-blob

from azure.storage.blob import BlobServiceClient

# Replace the placeholder with your storage account's connection string
client = BlobServiceClient.from_connection_string("your_connection_string")
```
If none of these work due to network restrictions, the reliable fallback is uploading your file directly to a Unity Catalog Volume via the Databricks UI and reading from there.
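For the Volume fallback, here's a minimal sketch of what reading the uploaded file looks like. It assumes a Databricks notebook with Unity Catalog enabled; the catalog, schema, volume, and file names below are placeholders, so substitute your own:

```python
# Unity Catalog Volume paths follow /Volumes/<catalog>/<schema>/<volume>/<file>.
# Everything after /Volumes/ below is a placeholder - use your own names.
volume_path = "/Volumes/main/default/my_volume/data.csv"

# In a notebook, Spark reads straight from the Volume path:
# df = spark.read.csv(volume_path, header=True, inferSchema=True)

# Volumes are also exposed to the local filesystem, so plain
# Python file APIs work too:
# with open(volume_path) as f:
#     first_line = f.readline()
```

Once the file is in a Volume, no storage credentials or network egress are needed, which is why this route is the most reliable one in Community/Free Edition.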
What method are you currently using?
Angel Shrestha