How to write the Delta files for a managed table? How can I define the sink?

a2_ish
New Contributor II

I have tried the code below to write data to a Delta table and save the Delta files to a sink. When I use Azure Storage as the sink I get a "not enough access" error, even though I can confirm I have sufficient access to the storage account. However, the same code runs with the path users/ankit.kumar@databricks.com/datadash, but I am not sure where this path lives physically. Where can I see the Delta files that have been created?

%sql
USE CATALOG hive_metastore;
CREATE DATABASE IF NOT EXISTS demo_db;
USE DATABASE demo_db;
 
# This gives an error:
#   shaded.databricks.org.apache.hadoop.fs.azure.AzureException:
#   hadoop_azure_shaded.com.microsoft.azure.storage.StorageException:
#   This request is not authorized to perform this operation using this permission.
path = "wasbs://abcstoragecontainer@azlogs.blob.core.windows.net"
(bronzeDF.writeStream
  .format("delta")
  .outputMode("append")
  .trigger(once=True)
  .option("mergeSchema", "true")
  .option("checkpointLocation", path + "/bronze_checkpoint")
  .toTable("turbine_bronze"))  # .start(path + "/turbine_bronze")
 
 
# This path works, but where is it located? Where can I find the files?
 
path = "/Users/ankit.kumar@databricks.com/demo_db"
(bronzeDF.writeStream
  .format("delta")
  .outputMode("append")
  .trigger(once=True)
  .option("mergeSchema", "true")
  .option("checkpointLocation", path + "/bronze_checkpoint")
  .toTable("turbine_bronze"))  # .start(path + "/turbine_bronze")
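For reference, a minimal sketch of how to check where the managed table's files actually end up (assuming the table name used above; DESCRIBE DETAIL on a Delta table returns its storage location):

# Ask the metastore where the managed table keeps its Delta files.
location = (spark.sql("DESCRIBE DETAIL hive_metastore.demo_db.turbine_bronze")
              .select("location")
              .first()[0])
print(location)  # typically dbfs:/user/hive/warehouse/demo_db.db/turbine_bronze for a managed table

# List the data files plus the _delta_log directory at that location.
display(dbutils.fs.ls(location))

A path such as /Users/ankit.kumar@databricks.com/demo_db with no scheme should resolve under the DBFS root, so the checkpoint files would be visible with dbutils.fs.ls("dbfs:/Users/ankit.kumar@databricks.com/demo_db").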


Hubert-Dudek
Esteemed Contributor III

It is easier to mount ADLS as a folder as explained here:

https://community.databricks.com/s/feed/0D53f00001eQGOHCA4
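For context, a rough sketch of such a mount, assuming an ADLS Gen2 account and a service principal whose secret is stored in a Databricks secret scope (every name below is a placeholder, not taken from this thread):

# Placeholder values: replace the client id, secret scope/key, tenant id,
# container and storage account names with your own.
configs = {
  "fs.azure.account.auth.type": "OAuth",
  "fs.azure.account.oauth.provider.type":
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
  "fs.azure.account.oauth2.client.id": "<application-client-id>",
  "fs.azure.account.oauth2.client.secret":
    dbutils.secrets.get(scope="<secret-scope>", key="<client-secret-key>"),
  "fs.azure.account.oauth2.client.endpoint":
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Mount the container once; afterwards it behaves like a normal folder,
# e.g. path = "/mnt/<container>" in the writeStream code above.
dbutils.fs.mount(
  source = "abfss://<container>@<storage-account>.dfs.core.windows.net/",
  mount_point = "/mnt/<container>",
  extra_configs = configs)

If the original wasbs error persists, it is usually because the identity being used has no data-plane role on the container (for example Storage Blob Data Contributor), which is granted separately from management-plane access.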

Anonymous
Not applicable

Hi @Ankit Kumar

Does @Hubert Dudek's response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly?

We'd love to hear from you.

Thanks!
