- 8192 Views
- 8 replies
- 1 kudos
Hi, when reading a Delta Lake table (created by Auto Loader) with this code: df = ( spark.readStream .format("cloudFiles") .option("cloudFiles.format", "delta") .option("cloudFiles.schemaLocation", f"{silver_path}/_checkpoint") .load(bronz...
Latest Reply
@Vladif1 The error occurs because the `cloudFiles` format in Auto Loader is meant for ingesting raw file formats such as CSV and JSON (see the Auto Loader format-support documentation for the full list). For Delta tables, you should use the Delta format directly. #Sample Example
bronze...
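The fix described in the reply above can be sketched as follows. This is a minimal illustration assuming a Databricks notebook where `spark` already exists; the path and variable names are placeholders, not taken from the thread:

```python
# Hypothetical paths -- substitute your own bronze/silver locations.
bronze_path = "/mnt/datalake/bronze/events"
silver_path = "/mnt/datalake/silver/events"

# Auto Loader (format "cloudFiles") is only for raw ingest formats
# (CSV, JSON, Parquet, Avro, text, binaryFile). A Delta table is
# itself already a streaming source, so read it with format("delta"):
df = (
    spark.readStream
    .format("delta")          # not "cloudFiles" -- Delta is not a raw ingest format
    .load(bronze_path)
)

# Write the stream onward to the silver layer, keeping the checkpoint
# under the silver path as in the original question:
(
    df.writeStream
    .format("delta")
    .option("checkpointLocation", f"{silver_path}/_checkpoint")
    .start(silver_path)
)
```

Note that `cloudFiles.schemaLocation` is dropped entirely: a Delta table carries its own schema, so schema inference (and hence a schema location) does not apply.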
by a2_ish • New Contributor II
- 2760 Views
- 1 replies
- 0 kudos
I have the below code, which works for the path below but fails when path = Azure storage account path. I have enough access to write to and update the storage account. I would like to know what I am doing wrong, and for the path below which works, how can I phys...
Latest Reply
@Ankit Kumar: The error message you received indicates that the user does not have sufficient permission to access the Azure Blob Storage account. You mentioned that you have enough access to write to and update the storage account, but it's possible t...
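A common cause of this error is that the cluster has no credentials configured for the storage account at all, so Spark falls back to anonymous access. One standard way to wire up access on Databricks is a service principal via the ABFS OAuth settings. The sketch below uses these documented Hadoop/ABFS configuration keys; the storage account name, secret scope, and secret keys are placeholders I have made up for illustration:

```python
# Hypothetical storage account name -- replace with your own.
storage_account = "mystorageaccount"

# Service-principal credentials; on Databricks these are normally read
# from a secret scope rather than hard-coded in the notebook.
# Scope and key names here are made-up examples.
client_id     = dbutils.secrets.get(scope="my-scope", key="sp-client-id")
client_secret = dbutils.secrets.get(scope="my-scope", key="sp-client-secret")
tenant_id     = dbutils.secrets.get(scope="my-scope", key="sp-tenant-id")

suffix = f"{storage_account}.dfs.core.windows.net"

# Standard ABFS OAuth configuration keys for Azure Data Lake Storage Gen2:
spark.conf.set(f"fs.azure.account.auth.type.{suffix}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{suffix}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{suffix}", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{suffix}", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{suffix}",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

# After this, abfss:// paths on the account should resolve, e.g.:
# df.write.format("delta").save(
#     f"abfss://bronze@{storage_account}.dfs.core.windows.net/events")
```

Even with the credentials wired up correctly, the write will still fail unless the service principal has been granted a data-plane role on the account (typically "Storage Blob Data Contributor") -- the control-plane permissions that let you manage the storage account in the portal are not sufficient for reading and writing blobs.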