- 8130 Views
- 2 replies
- 6 kudos
I want to store a notebook with functions two folders up from the current notebook. I know that I can start the path with ../ to go up one folder, but when I tried .../ it wouldn't go up two folders. Is there a way to do this?
Latest Reply
To access a notebook one folder up, use ../notebook_2; to go two folders up and access a notebook (say "secret"), use ../../secret.
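To pick up the functions, a %run cell with a relative path works; a minimal sketch, where the notebook name secret comes from the reply above and the folder layout is assumed:

%run ../../secret

The %run must be the only command in its cell; after it runs, the functions defined in secret are available in the calling notebook.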
1 More Replies
- 4796 Views
- 1 reply
- 1 kudos
I am reading the data from a folder /mnt/lake/customer, where mnt/lake is the mount path referring to ADLS Gen 2. Now I would like to rename the folder from /mnt/lake/customer to /mnt/lake/customeraddress without copying the data from one folder to ano...
Latest Reply
Atanu (Databricks Employee)
https://docs.databricks.com/data/databricks-file-system.html#local-file-api-limitations might help, @Simhadri Raju.
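A rename without manually copying the data can usually be done as a directory move; a minimal sketch, assuming the /mnt/lake mount from the question exists (on ADLS Gen2 with a hierarchical namespace this is typically a metadata operation, though depending on the storage layer a move may still copy under the hood):

# Move (effectively rename) the folder and everything under it.
dbutils.fs.mv("dbfs:/mnt/lake/customer", "dbfs:/mnt/lake/customeraddress", recurse=True)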
by wyzer • Contributor II
- 3499 Views
- 2 replies
- 4 kudos
Hello, how do I show the properties of the folders/files from DBFS? Currently I am using this command: display(dbutils.fs.ls("dbfs:/")) but it only shows: path, name, size. How do I show these properties?: CreatedBy (Name), CreatedOn (Date), ModifiedBy (Name), Modi...
Latest Reply
The only idea is to use the %sh magic command, but there is no owner name (just root).
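If size and modification time are enough, the local file API can be used from Python; a minimal sketch, assuming /dbfs is available on the driver (owner metadata such as CreatedBy/ModifiedBy is not tracked by DBFS, which is why %sh only shows root):

import datetime
import os

# List entries at the DBFS root via the local file API and print size and last-modified time.
for entry in os.scandir("/dbfs/"):
    info = entry.stat()
    modified = datetime.datetime.fromtimestamp(info.st_mtime)
    print(entry.name, info.st_size, modified)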
1 More Replies
by nmud19 • New Contributor II
- 68224 Views
- 8 replies
- 6 kudos
I have a folder at location dbfs:/mnt/temp
I need to delete this folder. I tried using
%fs rm mnt/temp
&
dbutils.fs.rm("mnt/temp")
Could you please help me out with what I am doing wrong?
Latest Reply
use this (the last line should be indented only once, under the for loop, not inside the if):
def delete_mounted_dir(dirname):
    files = dbutils.fs.ls(dirname)
    for f in files:
        if f.isDir():
            delete_mounted_dir(f.path)
        dbutils.fs.rm(f.path, recurse=True)
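For a single folder, the built-in recursive flag is often enough; a minimal sketch, using the fully qualified path of the mount point from the question:

# Delete the folder and everything under it; the dbfs:/ prefix avoids the relative-path problem above.
dbutils.fs.rm("dbfs:/mnt/temp", recurse=True)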
7 More Replies
by asher • New Contributor II
- 9064 Views
- 1 reply
- 0 kudos
I am trying to find a way to list all files, and their sizes, in all folders and subfolders. I guess these are called blobs in the Databricks world. Anyway, I can easily list all files, and their sizes, in one single folder, but ...
Latest Reply
from azure.storage.blob import BlockBlobService

block_blob_service = BlockBlobService(account_name='your_acct_name', account_key='your_acct_key')
mylist = []
generator = block_blob_service.list_blobs('rawdata')
for blob in generator:
    mylist.append(...
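An alternative that stays inside Databricks is to walk the mounted path recursively with dbutils.fs.ls; a minimal sketch, where the starting path /mnt/lake is only an assumption for illustration:

# Recursively collect (path, size) pairs for every file under a DBFS path.
def list_files_recursively(path):
    results = []
    for entry in dbutils.fs.ls(path):
        if entry.isDir():
            results.extend(list_files_recursively(entry.path))
        else:
            results.append((entry.path, entry.size))
    return results

for file_path, size in list_files_recursively("/mnt/lake"):
    print(file_path, size)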