- 7829 Views
- 11 replies
- 3 kudos
Latest Reply
Can you not use a No Isolation Shared cluster when Table access controls are enabled at the workspace level?
10 More Replies
- 5161 Views
- 5 replies
- 0 kudos
I have SQL warehouse endpoints that work fine when querying from applications such as Tableau, but running the included sample query against a running endpoint from the workspace Query Editor returns "Unable to upload to DBFS Query...
Latest Reply
Hi @Marvin Ginns, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...
4 More Replies
by
Bas1
• New Contributor III
- 8792 Views
- 17 replies
- 20 kudos
In Azure Databricks the DBFS storage account is open to all networks. Changing that to use a private endpoint or minimizing access to selected networks is not allowed. Is there any way to add network security to this storage account? Alternatively, is...
Latest Reply
How can we secure the storage account in the managed resource group which holds the DBFS with restricted network access, since access from all networks is blocked by our Azure storage account policy?
16 More Replies
- 15102 Views
- 4 replies
- 0 kudos
Latest Reply
db_path = 'file:///Workspace/Users/l<xxxxx>@databricks.com/TITANIC_DEMO/tested.csv'
df = spark.read.csv(db_path, header=True, inferSchema=True)
3 More Replies
- 4358 Views
- 4 replies
- 0 kudos
I took the Azure datasets that are available for practice, got the 10 days of data from that dataset, and now I want to save this data to DBFS in CSV format. I am facing an error: "No such file or directory: 'No such file or directory: '/dbfs...
Latest Reply
Hi, after some exercises you need to be aware that a folder created with dbutils.fs.mkdirs("/dbfs/tmp/myfolder") is created at /dbfs/dbfs/tmp/myfolder. If you want to access the path to_csv("/dbfs/tmp/myfolder/mytest.csv"), you should create it with this script: dbutils.fs...
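The path mix-up in this reply can be made concrete. As a minimal sketch (the helper below is mine, not a Databricks API): dbutils.fs.* interprets paths as DBFS-rooted, while the driver's local filesystem sees DBFS under a /dbfs FUSE prefix, so a path that already starts with /dbfs gets doubled:

```python
def local_view(dbfs_path: str) -> str:
    """Where a DBFS-rooted path appears on the driver's local filesystem
    (via the /dbfs FUSE mount). Illustrative helper, not a Databricks API."""
    return "/dbfs" + dbfs_path

# The pitfall from this thread: dbutils.fs.mkdirs("/dbfs/tmp/myfolder")
# creates the DBFS directory /dbfs/tmp/myfolder, which local-file APIs
# such as pandas then see at the doubled path:
print(local_view("/dbfs/tmp/myfolder"))   # /dbfs/dbfs/tmp/myfolder

# Passing the DBFS-rooted path instead lines the two views up:
print(local_view("/tmp/myfolder"))        # /dbfs/tmp/myfolder
```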
3 More Replies
- 25155 Views
- 4 replies
- 1 kudos
I have been trying to embed an image from a DBFS location, but when I run the code the image shows as unknown, or as a question mark.
I have tried the following code:
The path of the file is dbfs:/FileStore/tables/svm.jpg: displayHTML("<img src ='dbfs:/FileStore/tabl...
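A common fix here, sketched under the assumption that the file really lives under dbfs:/FileStore: the browser cannot resolve dbfs:/ URIs, but files under /FileStore are served by the workspace at the /files/ URL path, so the img src should use that form (the helper below is illustrative, not a Databricks built-in):

```python
def filestore_url(dbfs_path: str) -> str:
    """Map a dbfs:/FileStore/... path to the /files/... URL a notebook's
    browser can load. Illustrative helper, not a Databricks built-in."""
    prefix = "dbfs:/FileStore/"
    if not dbfs_path.startswith(prefix):
        raise ValueError("expected a path under dbfs:/FileStore/")
    return "/files/" + dbfs_path[len(prefix):]

img = filestore_url("dbfs:/FileStore/tables/svm.jpg")   # '/files/tables/svm.jpg'
# In a notebook cell:
# displayHTML(f"<img src='{img}'>")
```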
Latest Reply
Is there a way to embed an image from mounted storage into my markdown cell? Or can this only be done using the dbfs files?
3 More Replies
by
kinsun
• New Contributor II
- 11523 Views
- 5 replies
- 0 kudos
Dear Databricks experts, I have some doubts when dealing with DBFS and the local file system. Case 01: copy a file from ADLS to DBFS. I am able to do so with the Python code below: #spark.conf.set("fs.azure.account.auth.type", "OAuth") spark.conf.set("fs.a...
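The truncated config in Case 01 typically follows this shape. This is a sketch only: the storage account, container, tenant id, and secret scope/key names below are placeholders of mine, not values from the post.

```python
# OAuth (service principal) access to ADLS Gen2, then a copy into DBFS.
# <storage-account>, <container>, <tenant-id> and the secret names are placeholders.
acct = "<storage-account>.dfs.core.windows.net"
spark.conf.set(f"fs.azure.account.auth.type.{acct}", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{acct}",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{acct}",
               dbutils.secrets.get("my-scope", "sp-client-id"))
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{acct}",
               dbutils.secrets.get("my-scope", "sp-client-secret"))
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{acct}",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")

# Copy one file from ADLS into DBFS:
dbutils.fs.cp(f"abfss://<container>@{acct}/path/file.csv", "dbfs:/tmp/file.csv")
```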
Latest Reply
Hi @KS LAU, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your q...
4 More Replies
by
bearys
• New Contributor II
- 2122 Views
- 2 replies
- 2 kudos
I have a large Delta table partitioned by an identifier column that I have now discovered has blank spaces in some of the identifiers; e.g., one partition can be defined by "Identifier=first identifier". Most partitions do not have these blank space...
Latest Reply
Hi @bearys, The error message suggests an illegal character in the path at a specific index.
The error is pointing to a blank space in the path "dbfs:/mnt/container/table_name/Identifier=first identifier/part-01347-8a9a157b-6d0d-75dd-b1b7-2aed12e057...
1 More Replies
- 78620 Views
- 5 replies
- 4 kudos
I have run the WordCount program and have saved the output into a directory as follows
counts.saveAsTextFile("/users/data/hobbit-out1")
subsequently I check that the output directory contains the expected number of files
%fs ls /users/data/hobbit-ou...
Latest Reply
@PrithwisMukerje,
To download a file from DBFS to your local computer's filesystem, you can use the Databricks CLI command databricks fs cp.
Here are the steps:
1. Open a terminal or command prompt on your local computer.
2. Run the follow...
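The truncated steps amount to a single copy command; a minimal sketch (the DBFS and local paths are illustrative, and this assumes the CLI is already installed and configured against your workspace):

```shell
# Copy one result file from DBFS to the current directory on your machine.
# Paths are illustrative; requires a configured Databricks CLI.
databricks fs cp dbfs:/users/data/hobbit-out1/part-00000 ./part-00000

# Or copy the whole output directory:
databricks fs cp --recursive dbfs:/users/data/hobbit-out1 ./hobbit-out1
```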
4 More Replies
- 14228 Views
- 17 replies
- 7 kudos
I am new to learning Spark and working on some practice. I have uploaded a zip file to the DBFS /FileStore/tables directory and am trying to run Python code to unzip the file. The code is: from zipfile import * with ZipFile("/FileStore/tables/fli...
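The likely fix, sketched with an illustrative file name (the original name is truncated in the post): plain-Python APIs such as zipfile do not understand DBFS-rooted paths like /FileStore/..., so the file must be opened through the local /dbfs FUSE mount.

```python
from zipfile import ZipFile

def extract_zip(zip_path: str, dest: str) -> list:
    """Extract a zip with the standard-library API and return the member names."""
    with ZipFile(zip_path) as zf:
        zf.extractall(dest)
        return zf.namelist()

# On Databricks (file name illustrative -- the original post is truncated):
# extract_zip("/dbfs/FileStore/tables/flights.zip", "/dbfs/tmp/flights")
```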
Latest Reply
What if changing the runtime is not an option? I'm experiencing a similar issue using the following: %pip install -r /dbfs/path/to/file.txt. This worked for a while, but now I'm getting the Errno 2 mentioned above. I am still able to print the same file...
16 More Replies
- 892 Views
- 2 replies
- 1 kudos
We have identified a potential bug in either DBFS or Pandas: when writing a dataframe using Pandas `to_csv`, `to_parquet`, `to_pickle`, etc. to a mounted ADLS location with a read-only service principal, no permission-denied exception is thrown. However, met...
Latest Reply
Hi @Yung-Hang Chang, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...
1 More Replies
- 1648 Views
- 3 replies
- 0 kudos
Is there a built-in utility function, e.g. in dbutils, that can convert between path strings that start with "dbfs:" and "/dbfs"? Some operations, e.g., copying from one location in DBFS to another using dbutils.fs.cp(), expect the path to start with "/db...
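As far as I know there is no such converter built into dbutils, but the mapping between the two forms is mechanical. A minimal sketch (the helper names are mine, not a Databricks API):

```python
def to_fuse(path: str) -> str:
    """dbfs:/x/y -> /dbfs/x/y (the local FUSE view). Illustrative helper."""
    if path.startswith("dbfs:/"):
        return "/dbfs/" + path[len("dbfs:/"):]
    return path

def to_uri(path: str) -> str:
    """/dbfs/x/y -> dbfs:/x/y (the dbutils.fs / Spark view). Illustrative helper."""
    if path.startswith("/dbfs/"):
        return "dbfs:/" + path[len("/dbfs/"):]
    return path

print(to_fuse("dbfs:/tmp/data.csv"))  # /dbfs/tmp/data.csv
print(to_uri("/dbfs/tmp/data.csv"))   # dbfs:/tmp/data.csv
```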
Latest Reply
Hi @Fijoy Vadakkumpadan, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best a...
2 More Replies
- 5002 Views
- 5 replies
- 3 kudos
I was going through Data Engineering with Databricks training, and in DE 3.3L - Databases, Tables & Views Lab section, it says "Defining database directories for groups of users can greatly reduce the chances of accidental data exfiltration." I agree...
Latest Reply
Hi @Dilorom A, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ...
4 More Replies
- 11086 Views
- 3 replies
- 0 kudos
Hi,
I'm a newbie learning Spark using Databricks. I did some investigation and searched whether this question had been asked earlier in the community forum, but I was unable to find anything.
1. DBFS is unable to detect the file even though it's present in it...
Latest Reply
I am having similar issues currently. I can read and access my storage account, but when I attempt to read or access the container, it tells me the path was not found. I created the container and have full access as an owner.
2 More Replies