by
Arty
• New Contributor II
- 6965 Views
- 5 replies
- 6 kudos
Hi All, can you please advise how I can arrange deletion of a loaded file from Azure Storage upon its successful load via Autoloader? As I understood, the Spark Structured Streaming "cleanSource" option is unavailable for Autoloader, so I'm trying to find the best way to ...
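Since Autoloader doesn't honor "cleanSource", a common workaround is to delete the source files yourself inside a `foreachBatch` hook, after the micro-batch has been durably written. Below is a minimal local sketch of that idea: plain local files and `os.remove` stand in for Azure Storage blobs and the actual storage delete call (`dbutils.fs.rm` or an Azure SDK client), and all names are hypothetical.

```python
import os

def delete_loaded_files(file_paths, remove=os.remove):
    """Delete the source files of a successfully processed micro-batch.

    In a real Autoloader job you would collect `file_paths` from the batch's
    file-path metadata column inside a foreachBatch function, and pass a
    storage-level delete (e.g. dbutils.fs.rm) as `remove`.
    """
    deleted = []
    for path in file_paths:
        try:
            remove(path)          # stand-in for the real storage delete
            deleted.append(path)
        except FileNotFoundError:
            pass                  # already gone; safe to ignore on a retry
    return deleted
```

Ignoring `FileNotFoundError` keeps the hook idempotent, which matters because a micro-batch can be retried after a partial failure.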
Latest Reply
Hi @Artem Sachuk, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...
4 More Replies
- 4823 Views
- 6 replies
- 7 kudos
How can I delete a file in DBFS with an illegal character? Someone put a file named "planejamento_[4098.]___SHORT_SAIA_JEANS__.xlsx" inside the folder /FileStore and I can't delete it because of this error: java.net.URISyntaxException: Illegal character...
Latest Reply
Try this: %sh ls -li /dbfs. If the file is located in a subdirectory, you can change the path mentioned above. The %sh magic command gives you access to Linux shell commands.
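The `URISyntaxException` comes from Hadoop's URI parser, not from the filesystem itself, so going through the local `/dbfs` FUSE mount (from `%sh` or from Python) sidesteps the parsing entirely: characters like `[` and `]` are just bytes in a path. A small sketch, using a throwaway temp file in place of the real `/dbfs/FileStore/...` path:

```python
import os
import tempfile

def remove_via_local_path(path):
    """Delete a file by its plain filesystem path; no URI parsing involved."""
    os.remove(path)

# Demo with a file whose name would break a URI parser.
tmpdir = tempfile.mkdtemp()
bad_name = os.path.join(tmpdir, "planejamento_[4098.]_test.xlsx")
open(bad_name, "w").close()

# On Databricks the path would be the FUSE form, e.g. "/dbfs/FileStore/<file>".
remove_via_local_path(bad_name)
```

The same reasoning explains why `%sh rm '/dbfs/FileStore/<file>'` works where `dbutils.fs.rm("dbfs:/FileStore/<file>")` fails: only the latter routes the name through a URI.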
5 More Replies
- 2621 Views
- 2 replies
- 3 kudos
I accidentally deleted a parquet file in DBFS manually. How can I recover this file?
Latest Reply
Hi @pansiri panaudom, there is no option to restore deleted files in Databricks.
1 More Replies
by
MBV3
• New Contributor III
- 1970 Views
- 1 replies
- 2 kudos
What is the best way to delete files from the gcp bucket inside spark job?
Latest Reply
@M Baig yes, you just need to create a service account for Databricks and then assign the Storage Admin role to the bucket. After that you can mount GCS the standard way: bucket_name = "<bucket-name>" mount_name = "<mount-name>" dbutils.fs.mount("gs://%s" % bucket_na...
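The mount pattern above can be sketched out as follows. The bucket and mount names are hypothetical placeholders, and `dbutils` exists only inside Databricks notebooks, so the actual mount and delete calls are shown as comments; only the path construction runs standalone.

```python
# Hypothetical names; replace with your own bucket and mount point.
bucket_name = "my-bucket"
mount_name = "my-mount"

source = "gs://%s" % bucket_name      # GCS URI for the bucket
mount_point = "/mnt/%s" % mount_name  # where it appears under DBFS

# On a Databricks cluster whose service account has the Storage Admin role
# on the bucket, the mount itself would be:
#   dbutils.fs.mount(source, mount_point)
# and files could then be deleted from inside a job with:
#   dbutils.fs.rm(mount_point + "/path/to/file")
```

Once mounted, the bucket behaves like any other DBFS path, so deletion from a Spark job reduces to an ordinary `dbutils.fs.rm` on the mounted path.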