Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

writing to blob storage from databricks (parquet format)

yagmur
New Contributor II

Hi,
I am supposed to create a transformation notebook, but I am having trouble saving the transformed file into blob storage.
I am not using multiple layers, just the one I perform the transformation in. If I use wasbs I receive one error, and if I use abfss I receive a different one. I tried mounting and saving the files to DBFS, which worked, but I couldn't move them into blob storage.
When I use a connection (Azure service principal) I get the first error I mentioned (the error screenshots are attached).
Thanks in advance.

PS: If I try to save to the raw container, which I mounted and from which the files are read, the save succeeds. Since I have also connected without a mount, I don't understand why I am getting that error message.

2 REPLIES

szymon_dybczak
Contributor III

Hi @yagmur,

Did you assign the required permissions to the service principal on the storage account?
Also make sure you're configuring the connection to the storage account correctly. You should have something similar to the code below:

# OAuth (service principal) configuration for ADLS Gen2 access
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "your_client_id",
    "fs.azure.account.oauth2.client.secret": "your_secret",
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/your_tenant_id/oauth2/token",
}

# Mount the container into DBFS using the service principal credentials
dbutils.fs.mount(
    source="abfss://your_container@your_storage_account.dfs.core.windows.net/",
    mount_point="/mnt/your_mount_point_name",
    extra_configs=configs,
)
Anyway, nowadays you should use Unity Catalog and configure storage access with a Storage Credential.
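For the non-mount route the original post mentions, direct abfss:// access is typically configured per storage account on the Spark session. A sketch with placeholder names; `configure_abfss_oauth` and `abfss_path` are illustrative helpers, not Databricks APIs:

```python
def abfss_path(container, account, folder=""):
    # Direct ABFS URI for a folder in an ADLS Gen2 container.
    return f"abfss://{container}@{account}.dfs.core.windows.net/{folder}"

def configure_abfss_oauth(spark, account, client_id, client_secret, tenant_id):
    # Set per-storage-account OAuth (service principal) properties so that
    # reads and writes can target abfss:// paths directly, without a mount.
    # Requires a Spark session on a Databricks cluster.
    suffix = f"{account}.dfs.core.windows.net"
    spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
    spark.conf.set(f"fs.azure.account.oauth.provider.type.{suffix}",
                   "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
    spark.conf.set(f"fs.azure.account.oauth2.client.id.{suffix}", client_id)
    spark.conf.set(f"fs.azure.account.oauth2.client.secret.{suffix}", client_secret)
    spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{suffix}",
                   f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")
```

With those properties set, and the service principal granted a data role such as Storage Blob Data Contributor on the account, a parquet write to `abfss_path(...)` should work without any mount.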

Thanks for the effort, but that is the same as what I did. It's still not working.
