- 7895 Views
- 6 replies
- 0 kudos
I have SQL warehouse endpoints that work fine when querying from applications such as Tableau, but running the included sample query against a running endpoint from the Query Editor in the workspace returns "Unable to upload to DBFS Query...
Latest Reply
Hi @Marvin Ginns Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers ...
5 More Replies
- 14842 Views
- 10 replies
- 3 kudos
Latest Reply
Can you not use a No Isolation Shared cluster with Table access controls enabled on workspace level?
9 More Replies
by Bas1 • New Contributor III
- 13041 Views
- 16 replies
- 20 kudos
In Azure Databricks the DBFS storage account is open to all networks. Changing that to use a private endpoint or minimizing access to selected networks is not allowed. Is there any way to add network security to this storage account? Alternatively, is...
Latest Reply
How can we secure the storage account in the managed resource group which holds the DBFS with restricted network access, since access from all networks is blocked by our Azure storage account policy?
15 More Replies
- 24800 Views
- 4 replies
- 0 kudos
Latest Reply
db_path = 'file:///Workspace/Users/l<xxxxx>@databricks.com/TITANIC_DEMO/tested.csv'
df = spark.read.csv(db_path, header = "True", inferSchema="True")
3 More Replies
- 6283 Views
- 4 replies
- 0 kudos
I took the Azure datasets that are available for practice. I got 10 days of data from that dataset and now I want to save this data into DBFS in CSV format. I am facing an error: "No such file or directory: 'No such file or directory: '/dbfs...
Latest Reply
Hi, after some experimenting, you need to be aware that a folder created with dbutils.fs.mkdirs("/dbfs/tmp/myfolder") is actually created at /dbfs/dbfs/tmp/myfolder. If you want to access the path to_csv("/dbfs/tmp/myfolder/mytest.csv"), you should create it with this script: dbutils.fs...
3 More Replies
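The reply above points at a common pitfall: dbutils.fs.* treats paths as DBFS-rooted, so passing a /dbfs FUSE path produces a double /dbfs/dbfs prefix on the mount. A minimal sketch of a guard (the helper name is hypothetical, not a dbutils API):

```python
def dbutils_path(path: str) -> str:
    """Normalize a path for dbutils.fs.* calls, which treat paths as
    DBFS-rooted. Passing '/dbfs/tmp/myfolder' to dbutils.fs.mkdirs
    would create '/dbfs/dbfs/tmp/myfolder' on the FUSE mount, so the
    '/dbfs' prefix is stripped first."""
    if path.startswith("/dbfs/"):
        return path[len("/dbfs"):]
    return path

# Intended usage on Databricks (not runnable locally):
# dbutils.fs.mkdirs(dbutils_path("/dbfs/tmp/myfolder"))
```

With this, the folder lands at dbfs:/tmp/myfolder, and Pandas can then reach it as /dbfs/tmp/myfolder/mytest.csv.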
- 30148 Views
- 4 replies
- 1 kudos
I have been trying to embed an image from a DBFS location, but when I run the code the image shows as unknown or as a question mark.
I have tried the following code:
The path of the file is dbfs:/FileStore/tables/svm.jpg
displayHTML("<img src ='dbfs:/FileStore/tabl...
Latest Reply
Is there a way to embed an image from mounted storage into my markdown cell? Or can this only be done using the dbfs files?
3 More Replies
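A dbfs:/ URI in an img src will not render, because the browser cannot resolve that scheme; files under dbfs:/FileStore are instead served over HTTP at the workspace's /files/ route, which is what displayHTML needs. A small sketch of the rewrite (the helper name is hypothetical):

```python
def filestore_img_tag(dbfs_path: str) -> str:
    """Build an <img> tag for a file under dbfs:/FileStore, which the
    workspace serves over HTTP at the /files/ route."""
    prefix = "dbfs:/FileStore/"
    if not dbfs_path.startswith(prefix):
        raise ValueError("expected a path under dbfs:/FileStore/")
    return f"<img src='/files/{dbfs_path[len(prefix):]}'>"

# On Databricks:
# displayHTML(filestore_img_tag("dbfs:/FileStore/tables/svm.jpg"))
```

For mounted storage, copying the image into /FileStore first (e.g. with dbutils.fs.cp) is the usual workaround, since only FileStore is exposed on that route.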
by kinsun • New Contributor II
- 21867 Views
- 5 replies
- 1 kudos
Dear Databricks Expert, I have some doubts when dealing with DBFS and the local file system. Case 01: Copy a file from ADLS to DBFS. I am able to do so through the Python code below: #spark.conf.set("fs.azure.account.auth.type", "OAuth") spark.conf.set("fs.a...
Latest Reply
Hi @KS LAU Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your q...
4 More Replies
- 18774 Views
- 12 replies
- 4 kudos
I am new to Spark and working on some practice; I have uploaded a zip file to the DBFS /FileStore/tables directory and am trying to run Python code to unzip the file. The code is: from zipfile import * with ZipFile("/FileStore/tables/fli...
Latest Reply
What if changing the runtime is not an option? I'm experiencing a similar issue using the following: %pip install -r /dbfs/path/to/file.txt This worked for a while, but now I'm getting the Errno 2 mentioned above. I am still able to print the same file...
11 More Replies
- 1352 Views
- 2 replies
- 1 kudos
We identified a potential bug in either DBFS or Pandas: when writing a dataframe using Pandas `to_csv`, `to_parquet`, `to_pickle`, etc. to a mounted ADLS location with a read-only service principal, no permission-denied exception was thrown. However, met...
Latest Reply
Hi @Yung-Hang Chang Hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...
1 More Replies
- 2285 Views
- 3 replies
- 0 kudos
Is there a built-in utility function, e.g. in dbutils, that can convert between path strings that start with "dbfs:" and "/dbfs"? Some operations, e.g. copying from one location in DBFS to another using dbutils.fs.cp(), expect the path starting with "/db...
Latest Reply
Hi @Fijoy Vadakkumpadan Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best a...
2 More Replies
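I'm not aware of a single built-in converter in dbutils, but the mapping is a plain prefix swap: dbutils.fs.* and Spark accept dbfs:/ URIs, while local Python file APIs need the /dbfs FUSE prefix. A hedged sketch (function names are hypothetical):

```python
def to_fuse(path: str) -> str:
    """dbfs:/foo/bar -> /dbfs/foo/bar (for local Python file APIs)."""
    if path.startswith("dbfs:/"):
        return "/dbfs/" + path[len("dbfs:/"):].lstrip("/")
    return path

def to_uri(path: str) -> str:
    """/dbfs/foo/bar -> dbfs:/foo/bar (for dbutils.fs and Spark)."""
    if path.startswith("/dbfs/"):
        return "dbfs:/" + path[len("/dbfs/"):]
    return path
```

Both functions pass unrecognized paths through unchanged, so they are safe to apply unconditionally before each kind of call.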
- 6614 Views
- 3 replies
- 4 kudos
I was going through Data Engineering with Databricks training, and in DE 3.3L - Databases, Tables & Views Lab section, it says "Defining database directories for groups of users can greatly reduce the chances of accidental data exfiltration." I agree...
Latest Reply
Hi @Dilorom A Hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we ...
2 More Replies
- 13462 Views
- 3 replies
- 0 kudos
Hi,
I'm a newbie learning Spark using Databricks. I did some investigation and searched whether this question had been asked earlier in the community forum, but I was unable to find anything.
1. DBFS is unable to detect the file even though it's present in it...
Latest Reply
I am having similar issues currently. I can read and access my storage account, but when I attempt to read or access the container it tells me the path is not found. I created the container and have full access as an owner.
2 More Replies
by aki1 • New Contributor II
- 2378 Views
- 2 replies
- 1 kudos
I would like to download a file in DBFS using the FileStore endpoint. If the file or folder name contains multibyte characters, the file path cannot be specified due to URL encoding and an error occurs. Question 1: If a file or folder name contains mul...
Latest Reply
Hi, the Databricks CLI can be used to download a file from DBFS: https://docs.databricks.com/dev-tools/cli/index.html Also, you can refer to https://stackoverflow.com/questions/49019706/databricks-download-a-dbfs-filestore-file-to-my-local-machine , which ...
1 More Replies
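For the multibyte-name question, percent-encoding each path segment yourself before requesting the /files/ endpoint usually sidesteps the encoding error. A standard-library sketch; the host and helper name are placeholders, not a Databricks API:

```python
from urllib.parse import quote

def filestore_download_url(host: str, rel_path: str) -> str:
    """Build the /files/ download URL for a file under dbfs:/FileStore,
    percent-encoding each segment so multibyte characters (encoded as
    UTF-8) survive in the URL."""
    encoded = "/".join(quote(seg) for seg in rel_path.split("/"))
    return f"https://{host}/files/{encoded}"
```

For example, a file named with Japanese characters under /FileStore/tables would have each character expanded to its UTF-8 percent-escapes in the resulting URL.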
- 4650 Views
- 6 replies
- 7 kudos
How can I delete a file in DBFS with an illegal character? Someone put a file named "planejamento_[4098.]___SHORT_SAIA_JEANS__.xlsx" inside the folder /FileStore, and I can't delete it because of this error: java.net.URISyntaxException: Illegal character...
Latest Reply
Try this: %sh ls -li /dbfs. If the file is located in a subdirectory, you can change the path mentioned above. The %sh magic command gives you access to Linux shell commands.
5 More Replies