Try the change below. Python file APIs need the '/dbfs' prefix in the path, but since you are using the output of dbutils.fs.ls, the paths carry the 'dbfs:' scheme prefix. Replace loader = TextLoader(i[0]) with loader = TextLoader(i[0].replace('dbfs:','/dbfs'))
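A minimal sketch of that path conversion, using stand-in paths in place of the tuples returned by dbutils.fs.ls (the file names are hypothetical):

```python
# Stand-in for the paths returned by dbutils.fs.ls, which use the
# 'dbfs:' scheme prefix rather than the local '/dbfs' mount prefix.
files = ["dbfs:/tmp/docs/a.txt", "dbfs:/tmp/docs/b.txt"]  # hypothetical paths

# Replace only the leading scheme (count=1) so the rest of the path
# is never touched; the result is openable by local Python file APIs.
local_paths = [p.replace("dbfs:", "/dbfs", 1) for p in files]
print(local_paths)  # ['/dbfs/tmp/docs/a.txt', '/dbfs/tmp/docs/b.txt']
```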
We use a token generated in the Account console to make REST API calls. You can try this workaround if your use case allows a token instead of a user ID/password. Below is an example in Python. user_json = { "schemas": [ "urn:ietf:params:scim:schemas:...
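As a sketch of the token-based call: the SCIM Users endpoint accepts the token in a bearer Authorization header. The workspace URL, token, and user fields below are placeholders, not values from the original answer:

```python
import json
import urllib.request

# Hypothetical values -- substitute your own workspace URL and token.
WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

def build_scim_request(workspace_url, token, user_json):
    """Build a POST to the SCIM Users endpoint authenticated with a bearer token."""
    return urllib.request.Request(
        url=f"{workspace_url}/api/2.0/preview/scim/v2/Users",
        data=json.dumps(user_json).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/scim+json",
        },
        method="POST",
    )

# Hypothetical payload using the standard SCIM core user schema URN.
user_json = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
    "userName": "someone@example.com",
}
req = build_scim_request(WORKSPACE_URL, TOKEN, user_json)
# Send with urllib.request.urlopen(req) from a machine that can reach the workspace.
```

The point of the workaround is that no user ID/password ever appears in the request; authentication is carried entirely by the Authorization header.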
As far as I know, you cannot copy files from your local machine to DBFS using dbutils. You can upload files to DBFS using the GUI option below: Data --> Browse DBFS --> Upload
The _metadata column provides the file modification timestamp. I tried it on DBFS but am not sure about ADLS. https://docs.databricks.com/ingestion/file-metadata-column.html
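A minimal notebook sketch of the linked docs, to run on a Databricks cluster (the input path and format are placeholders):

```python
# Databricks notebook sketch: _metadata is a hidden column, so it must be
# selected explicitly; its fields include file_path and file_modification_time.
df = (
    spark.read.format("json")
    .load("dbfs:/tmp/input/")  # hypothetical path
    .select("*", "_metadata.file_path", "_metadata.file_modification_time")
)
display(df)
```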