I am trying to iterate over a container (ADLS Gen2) in Azure Databricks using PySpark. Basically, I am using dbutils.fs.ls to list the contents of each folder with a recursive function, along the lines of the sketch below.
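For context, this is a minimal sketch of the recursive listing logic; the ABFSS root path, account name, and function name are placeholders rather than my real values:

```python
def list_files_recursively(path):
    """Walk a folder tree with dbutils.fs.ls and collect all file paths."""
    files = []
    for item in dbutils.fs.ls(path):
        if item.isDir():
            # Recurse into subfolders; this is the call that blows up with a
            # URI error when the folder name contains characters such as [ or ].
            files.extend(list_files_recursively(item.path))
        else:
            files.append(item.path)
    return files

# Hypothetical root path for illustration only.
all_files = list_files_recursively("abfss://container@account.dfs.core.windows.net/abd")
```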
This logic works perfectly fine for all folders except the ones that have [] in their names.
I am getting a java.net.URISyntaxException saying there are illegal characters in the path. I tried encoding the folder path, but then I get a FileNotFoundException, because Databricks tries to list a folder whose name literally contains the encoded characters.
Folder name: container/abd/qwe[rt]
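A sketch of the encoding attempt, using urllib.parse.quote on a hypothetical ABFSS path built from the folder name above:

```python
from urllib.parse import quote

# Hypothetical full path to the problematic folder.
raw_path = "abfss://container@account.dfs.core.windows.net/abd/qwe[rt]"

# Percent-encode the brackets so the URI parses; [ and ] become %5B and %5D.
encoded_path = quote(raw_path, safe=":/@")

# The URI error goes away, but dbutils now looks for a folder literally named
# "qwe%5Brt%5D", which does not exist, so this raises FileNotFoundException.
dbutils.fs.ls(encoded_path)
```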
It looks like this is a Spark issue.