by
Bas1
• New Contributor III
- 5953 Views
- 17 replies
- 20 kudos
In Azure Databricks the DBFS storage account is open to all networks. Changing that to use a private endpoint or minimizing access to selected networks is not allowed. Is there any way to add network security to this storage account? Alternatively, is...
Latest Reply
How can we secure the storage account in the managed resource group which holds the DBFS with restricted network access, since access from all networks is blocked by our Azure storage account policy?
16 More Replies
- 4650 Views
- 7 replies
- 3 kudos
Latest Reply
There are two ways to grant access to DBFS using ANY FILE:
To a user: GRANT SELECT ON ANY FILE TO '<user_mail_id>'
To a group: GRANT SELECT ON ANY FILE TO '<group_name>'
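As a sketch, the statements above could be issued from a notebook via spark.sql. The helper below only builds the SQL string; the principal value and the quoting style (single quotes, as in the reply above) are illustrative assumptions:

```python
def grant_select_any_file(principal: str) -> str:
    """Build a GRANT statement for the ANY FILE securable.

    `principal` is a user email or a group name (placeholder values here);
    quoting follows the form shown in the reply above.
    """
    return f"GRANT SELECT ON ANY FILE TO '{principal}'"

# In a Databricks notebook you would then run, e.g.:
# spark.sql(grant_select_any_file("alice@example.com"))
print(grant_select_any_file("alice@example.com"))
```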
6 More Replies
- 10116 Views
- 4 replies
- 0 kudos
Latest Reply
db_path = 'file:///Workspace/Users/l<xxxxx>@databricks.com/TITANIC_DEMO/tested.csv'
df = spark.read.csv(db_path, header=True, inferSchema=True)
3 More Replies
- 3351 Views
- 4 replies
- 0 kudos
I took the Azure datasets that are available for practice. I got the 10 days of data from that dataset and now I want to save this data into DBFS in CSV format. I am facing an error: "No such file or directory: 'No such file or directory: '/dbfs...
Latest Reply
Hi, after some exercises you need to be aware that a folder created with dbutils.fs.mkdirs("/dbfs/tmp/myfolder") is created in /dbfs/dbfs/tmp/myfolder. If you want to access the path to_csv("/dbfs/tmp/myfolder/mytest.csv"), you should create it with this script: dbutils.fs...
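The doubled /dbfs/dbfs prefix happens because dbutils.fs.* already resolves paths against the DBFS root, while local-file APIs see DBFS through the /dbfs FUSE mount. A minimal sketch of stripping the FUSE prefix before calling dbutils (the helper name is illustrative, not a dbutils API):

```python
def dbfs_api_path(path: str) -> str:
    """dbutils.fs.* resolves against the DBFS root, so passing a
    '/dbfs/...' (FUSE) path to it yields /dbfs/dbfs/... on disk.
    Strip the FUSE prefix first."""
    return path[len("/dbfs"):] if path.startswith("/dbfs/") else path

# dbutils.fs.mkdirs(dbfs_api_path("/dbfs/tmp/myfolder")) creates /tmp/myfolder,
# which local code then reaches as /dbfs/tmp/myfolder.
print(dbfs_api_path("/dbfs/tmp/myfolder"))  # /tmp/myfolder
```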
3 More Replies
- 21681 Views
- 4 replies
- 1 kudos
I have been trying to embed an image from a DBFS location; when I run the code, the image shows as unknown or a question mark.
I have tried the following code:
The path of the file is dbfs:/FileStore/tables/svm.jpg
displayHTML("<img src ='dbfs:/FileStore/tabl...
Latest Reply
Is there a way to embed an image from mounted storage into my markdown cell? Or can this only be done using the dbfs files?
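A common workaround for the original question: a browser cannot load dbfs:/ URIs, but files under dbfs:/FileStore are typically served at relative /files/ URLs, which an <img> tag can load. A sketch assuming that default FileStore serving route (mounted storage would generally need to be copied into FileStore first):

```python
def filestore_url(dbfs_path: str) -> str:
    """Map a dbfs:/FileStore/... path to the /files/... URL that
    displayHTML's <img> tag can load. Assumes the default FileStore
    serving endpoint; other DBFS locations are not web-served."""
    prefix = "dbfs:/FileStore/"
    if not dbfs_path.startswith(prefix):
        raise ValueError("expected a dbfs:/FileStore/ path")
    return "/files/" + dbfs_path[len(prefix):]

url = filestore_url("dbfs:/FileStore/tables/svm.jpg")
print(url)  # /files/tables/svm.jpg
# In a notebook: displayHTML(f"<img src='{url}'/>")
```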
3 More Replies
by
kinsun
• New Contributor II
- 7082 Views
- 5 replies
- 0 kudos
Dear Databricks Expert, I have some doubts when dealing with DBFS and the local file system. Case 01: Copy a file from ADLS to DBFS. I am able to do so through the Python code below:
#spark.conf.set("fs.azure.account.auth.type", "OAuth") spark.conf.set("fs.a...
Latest Reply
Hi @KS LAU, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your q...
4 More Replies
by
bearys
• New Contributor II
- 1411 Views
- 2 replies
- 2 kudos
I have a large Delta table partitioned by an identifier column that I have now discovered has blank spaces in some of the identifiers, e.g. one partition can be defined by "Identifier=first identifier". Most partitions do not have these blank space...
Latest Reply
Hi @bearys, The error message suggests an illegal character in the path at a specific index.
The error is pointing to a blank space in the path "dbfs:/mnt/container/table_name/Identifier=first identifier/part-01347-8a9a157b-6d0d-75dd-b1b7-2aed12e057...
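For reference, the "illegal character" complaint comes from URI parsing of the raw space in the partition directory name; percent-encoding shows what the legal form looks like (Spark normally applies such encoding itself when writing partitions, so this sketch is only illustrative):

```python
from urllib.parse import quote

def encode_partition_value(value: str) -> str:
    """Percent-encode a partition value for use in a URI-style path;
    a space becomes %20, which avoids the 'illegal character at index'
    URI error seen in the reply above."""
    return quote(value, safe="")

print(encode_partition_value("first identifier"))  # first%20identifier
```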
1 More Replies
- 69330 Views
- 5 replies
- 4 kudos
I have run the WordCount program and have saved the output into a directory as follows
counts.saveAsTextFile("/users/data/hobbit-out1")
subsequently I check that the output directory contains the expected number of files
%fs ls /users/data/hobbit-ou...
Latest Reply
@PrithwisMukerje ,
To download a file from DBFS to your local computer filesystem, you can use the Databricks CLI command databricks fs cp.
Here are the steps:
1. Open a terminal or command prompt on your local computer.
2. Run the follow...
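The truncated step presumably ends with the `databricks fs cp` invocation itself. A minimal sketch, with placeholder paths based on the thread's output directory (the command is only echoed here; running it requires a configured Databricks CLI):

```shell
# Copy one WordCount output file from DBFS to the local filesystem.
# SRC/DST are illustrative; adjust to your workspace and target folder.
SRC="dbfs:/users/data/hobbit-out1/part-00000"
DST="./part-00000.txt"
CMD="databricks fs cp $SRC $DST"
echo "$CMD"
```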
4 More Replies
- 10858 Views
- 17 replies
- 7 kudos
I am new to learning Spark and working on some practice; I have uploaded a zip file to the DBFS /FileStore/tables directory and am trying to run Python code to unzip the file; the Python code is:
from zipfile import *
with ZipFile("/FileStore/tables/fli...
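A likely cause of the error above is the missing /dbfs prefix: local Python sees a file uploaded to dbfs:/FileStore/tables/... only through the FUSE mount, i.e. /dbfs/FileStore/tables/... . A self-contained sketch of the extraction pattern (it builds a throwaway zip in a temp dir so it runs anywhere; on Databricks the zip_path would be the /dbfs/... path):

```python
import os
import tempfile
from zipfile import ZipFile

# Build a throwaway archive so the example is runnable outside Databricks;
# the file name mirrors the thread's truncated "/FileStore/tables/fli..." path.
workdir = tempfile.mkdtemp()
zip_path = os.path.join(workdir, "flights.zip")
with ZipFile(zip_path, "w") as zf:
    zf.writestr("flights.csv", "origin,dest\nJFK,LAX\n")

# Extract; on Databricks, zip_path would be "/dbfs/FileStore/tables/flights.zip".
out_dir = os.path.join(workdir, "extracted")
with ZipFile(zip_path) as zf:
    zf.extractall(out_dir)

print(sorted(os.listdir(out_dir)))  # ['flights.csv']
```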
Latest Reply
What if changing the runtime is not an option? I'm experiencing a similar issue using the following:
%pip install -r /dbfs/path/to/file.txt
This worked for a while, but now I'm getting the Errno 2 mentioned above. I am still able to print the same file...
16 More Replies
- 605 Views
- 2 replies
- 1 kudos
We identified a potential bug in either DBFS or pandas: when writing a dataframe using pandas `to_csv`, `to_parquet`, `to_pickle`, etc. to a mounted ADLS location with a read-only service principal, it didn't throw permission-denied exceptions. However, met...
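Until the root cause is settled, a defensive write-then-verify pattern catches silently swallowed writes: write, then immediately read the target back and compare. Sketched here with a plain file in a temp dir so it runs anywhere; on Databricks the target would be a /dbfs/mnt/... path:

```python
import os
import tempfile

# Write-then-verify: if a read-only mount silently drops the write,
# the read-back either fails or returns stale/empty content.
target = os.path.join(tempfile.mkdtemp(), "out.csv")
payload = "a,b\n1,2\n"

with open(target, "w") as f:
    f.write(payload)

with open(target) as f:
    written = f.read()

assert written == payload, "write did not land; check mount permissions"
print("write verified")
```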
Latest Reply
Hi @Yung-Hang Chang, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...
1 More Replies
- 1173 Views
- 3 replies
- 0 kudos
Is there a built-in utility function, e.g., in dbutils, that can convert between path strings that start with "dbfs:" and "/dbfs"? Some operations, e.g., copying from one location in DBFS to another using dbutils.fs.cp(), expect the path starting with "/db...
Latest Reply
Hi @Fijoy Vadakkumpadan, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best a...
2 More Replies
- 3329 Views
- 5 replies
- 3 kudos
I was going through Data Engineering with Databricks training, and in DE 3.3L - Databases, Tables & Views Lab section, it says "Defining database directories for groups of users can greatly reduce the chances of accidental data exfiltration." I agree...
Latest Reply
Hi @Dilorom A, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ...
4 More Replies
- 9584 Views
- 3 replies
- 0 kudos
Hi,
I'm a newbie learning Spark using Databricks. I did some investigation and searched whether this question had been asked earlier in the community forum, but I was unable to find anything.
1. DBFS is unable to detect the file even though it's present in it...
Latest Reply
I am having similar issues currently. I can read and access my storage account, but when I attempt to read or access the container it tells me the path was not found. I created the container and have full access as an owner.
2 More Replies
by
aki1
• New Contributor II
- 1186 Views
- 2 replies
- 1 kudos
I would like to download a file in DBFS using the FileStore endpoint. If the file or folder name contains multibyte characters, the file path cannot be specified due to URL encoding and an error occurs. Question 1: If a file or folder name contains mul...
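One way to sidestep the encoding error is to percent-encode each path segment yourself before building the download URL. A sketch assuming the default /files/ FileStore serving route (the Japanese file names are illustrative):

```python
from urllib.parse import quote

def filestore_download_path(dbfs_path: str) -> str:
    """Percent-encode each segment of a dbfs:/FileStore/... path so
    multibyte names survive the /files/ URL route. Illustrative sketch;
    assumes the default FileStore serving endpoint."""
    prefix = "dbfs:/FileStore/"
    if not dbfs_path.startswith(prefix):
        raise ValueError("expected a dbfs:/FileStore/ path")
    rest = dbfs_path[len(prefix):]
    # Encode per segment so the '/' separators themselves stay intact.
    return "/files/" + "/".join(quote(seg) for seg in rest.split("/"))

print(filestore_download_path("dbfs:/FileStore/データ/売上.csv"))
# /files/%E3%83%87%E3%83%BC%E3%82%BF/%E5%A3%B2%E4%B8%8A.csv
```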
Latest Reply
Hi, the Databricks CLI can be used to download a file from DBFS: https://docs.databricks.com/dev-tools/cli/index.html
Also, you can refer to https://stackoverflow.com/questions/49019706/databricks-download-a-dbfs-filestore-file-to-my-local-machine , which ...
1 More Replies