06-15-2023 05:56 AM
According to the documentation, using external locations is preferred over mount points.
Unfortunately, the basic functionality to manipulate files seems to be missing.
This is my scenario:
dbutils.fs.mkdirs(NewPath) does not work --> Operation failed: "This request is not authorized to perform this operation."
f = open(fullFileName, 'w+b') --> FileNotFoundError: [Errno 2] No such file or directory
f.write(ZipBinaryData)
f.close()
dbutils.fs.rm(NewPath, True) --> Operation failed: "This request is not authorized to perform this operation."
dbutils.fs.mv(NewPath, ArchivePath, True) --> Operation failed: "This request is not authorized to perform this operation."
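For context on the open() failure: Python's built-in open() only addresses the driver's local filesystem, so an abfss:// path cannot be opened directly. A minimal sketch of writing the binary data locally first and then copying it out, assuming a Databricks notebook where dbutils is available and using the names from the scenario above as hypothetical stand-ins:

import os

local_file = "/tmp/payload.zip"          # open() can only address driver-local paths
with open(local_file, "wb") as f:
    f.write(ZipBinaryData)               # ZipBinaryData as in the scenario above

# "file:" marks the driver-local filesystem for dbutils.fs
dbutils.fs.cp("file:" + local_file, NewPath + "/payload.zip")
os.remove(local_file)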
Any help or insights on how to get this working with external locations is greatly appreciated!
06-15-2023 07:20 PM
Sounds like a cloud provider permission issue. Which one are you using, AWS or Azure? How are you connecting to blob storage: via an external location with a managed identity or a SAS token? The easiest way to test connectivity is to click "Test connection" within the external location tab under "Data" (bottom left). If that is successful, you should test a simple read of the file directory...
dbutils.fs.ls("<blob url>")
06-15-2023 08:38 PM
Hi @Tjomme Vergauwen,
We haven't heard from you since the last response from @Tyler Retzlaff, and I was checking back to see if the suggestions helped you.
Otherwise, if you have found a solution, please share it with the community, as it can be helpful to others.
Also, please don't forget to click on the "Select As Best" button whenever the information provided helps resolve your question.
06-15-2023 11:43 PM
Hi,
We're using Azure.
External locations are created using a managed identity.
It's not a security issue as demonstrated below:
Same folder, different syntax to get the list of files. The first one works, the second one throws an error.
LIST 'abfss://landingzone@***.dfs.core.windows.net/DEV' --> works
%py
dbutils.fs.ls('abfss://landingzone@***.dfs.core.windows.net/DEV') --> throws error
06-16-2023 03:47 AM
That's really weird... Can you go into the external location in Databricks' Data tab and make sure your user has the right permissions?
06-16-2023 07:06 AM
It seems my access rights on the storage account are in order, but the ones on the container are missing. Reference: DataBricks UnityCatalog create table fails with "Failed to acquire a SAS token UnauthorizedAccessExc...
I'll have this changed and retry.
06-16-2023 08:25 AM
Cool, let me know how it goes
06-19-2023 05:50 AM
The main problem was related to the network configuration of the storage account: Databricks did not have access. Quite strange that it did manage to create folders...
Currently, the dbutils.fs functionality is working.
For the zipfile manipulation: that only works with local (or mounted) locations.
Workaround: copy to/from local storage to abfss when required (see the sketch below).
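A minimal sketch of that workaround, assuming a Databricks notebook where dbutils is available and the external location grants read/write access; the abfss URL and file names are placeholders:

import zipfile

remote_zip = "abfss://landingzone@account.dfs.core.windows.net/DEV/archive.zip"  # placeholder
local_zip = "/tmp/archive.zip"

# 1. Copy from the external location to driver-local storage.
dbutils.fs.cp(remote_zip, "file:" + local_zip)

# 2. Manipulate the zip locally; zipfile only works on local (or mounted) paths.
with zipfile.ZipFile(local_zip, "r") as z:
    z.extractall("/tmp/extracted")

# 3. Copy the results back to abfss when required (True enables recursive copy).
dbutils.fs.cp("file:/tmp/extracted",
              "abfss://landingzone@account.dfs.core.windows.net/DEV/extracted", True)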