Hi there,
I have pretty much the exact code you have here, and yet it still doesn't work; it fails with "No such file or directory".
Is this a limitation of the Community Edition?
import requests

CHUNK_SIZE = 4096

def get_remote_file(dataSrcUrl, destFile):
    '''Simple old-skool Python function to load a remote URL into DBFS.'''
    # Prepend /dbfs so the ordinary file API writes through the FUSE mount
    destFile = "/dbfs" + destFile
    with requests.get(dataSrcUrl, stream=True) as resp:
        if resp.ok:
            with open(destFile, "wb") as f:
                for chunk in resp.iter_content(chunk_size=CHUNK_SIZE):
                    f.write(chunk)

get_remote_file("https://gitlab.com/opstar/share20/-/raw/master/university.json", "/Filestore/data/lgdt/university.json")
The directory "dbfs:/Filestore/data/lgdt" definitely exists, as I can see it when running the dbutils.fs.ls(path) command.
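For what it's worth, here is the quick check I would expect to confirm whether that directory is reachable through the /dbfs FUSE mount (a minimal sketch using the same path as above, assuming a standard Databricks runtime):

import os

# Same directory the download targets, but viewed through the /dbfs FUSE mount
local_dir = "/dbfs/Filestore/data/lgdt"

# If this prints False, open() on a file inside it will raise
# "No such file or directory", even though dbutils.fs.ls("dbfs:/Filestore/data/lgdt")
# lists the folder on the DBFS side.
print(os.path.exists(local_dir))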