Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Data_Engineer_3
by New Contributor III
  • 19427 Views
  • 12 replies
  • 4 kudos

FileNotFoundError: [Errno 2] No such file or directory: '/FileStore/tables/flight_data.zip'. The data and file exist in the location mentioned above

I am new to learning Spark and working on some practice. I have uploaded a zip file to the DBFS /FileStore/tables directory and am trying to run Python code to unzip it. The Python code is: from zipfile import * with ZipFile("/FileStore/tables/fli...

Latest Reply
883022
New Contributor II
  • 4 kudos

What if changing the runtime is not an option? I'm experiencing a similar issue using the following: %pip install -r /dbfs/path/to/file.txt This worked for a while, but now I'm getting the Errno 2 mentioned above. I am still able to print the same file...

11 More Replies
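A minimal sketch of the usual fix, assuming the cluster still exposes DBFS through the /dbfs FUSE mount: local Python file APIs such as zipfile cannot open the bare DBFS path "/FileStore/tables/flight_data.zip", but they can open the same file through the "/dbfs" prefix. The extraction target directory below is a hypothetical example.

from zipfile import ZipFile

# Local file APIs see DBFS only through the /dbfs mount point
# (assumption: the runtime in use still exposes this mount).
zip_path = "/dbfs/FileStore/tables/flight_data.zip"
extract_dir = "/dbfs/FileStore/tables/flight_data"  # hypothetical target

with ZipFile(zip_path) as zf:
    zf.extractall(extract_dir)

# In a Databricks notebook, list the extracted files to confirm:
display(dbutils.fs.ls("dbfs:/FileStore/tables/flight_data"))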
Tonny_Stark
by New Contributor III
  • 3502 Views
  • 3 replies
  • 0 kudos

FileNotFoundError: [Errno 2] No such file or directory:

I have the following error in Databricks when I want to unzip files: FileNotFoundError: [Errno 2] No such file or directory: but the file is there. I already tried several ways and nothing works. I have tried modifying it by placing /dbfs/mnt/dbfs/mnt/d...

error
Latest Reply
karthik_p
Esteemed Contributor
  • 0 kudos

@Alfredo Vallejos then your file is a tar.gz file, right? Have you tried the tar command instead of unzip?

2 More Replies
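If the archive really is a tar.gz rather than a zip, as the reply suggests, Python's tarfile module works the same way through the /dbfs mount. This is a sketch only; the archive and target paths below are hypothetical, since the path in the post is truncated.

import tarfile

archive_path = "/dbfs/mnt/data/archive.tar.gz"  # hypothetical path
extract_dir = "/dbfs/mnt/data/extracted"        # hypothetical target

# "r:gz" opens a gzip-compressed tar archive
with tarfile.open(archive_path, mode="r:gz") as tf:
    tf.extractall(extract_dir)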
kkawka1
by New Contributor III
  • 3683 Views
  • 6 replies
  • 5 kudos

How to delete strings from the /FileStore/

We have just started working with Databricks in one of my university modules, and the lecturers gave us a set of commands to practice saving data in the FileStore. One of the commands was the following: dbutils.fs.cp("/databricks-datasets/weathh...

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Konrad Kawka, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I suggest ...

5 More Replies
aki1
by New Contributor II
  • 2495 Views
  • 2 replies
  • 1 kudos

How to download a file in DBFS that contains multibyte characters in the file path?

I would like to download a file in DBFS using the FileStore endpoint. If the file or folder name contains multibyte characters, the file path cannot be specified due to URL encoding and an error occurs. Question 1: If a file or folder name contains mul...

  • 2495 Views
  • 2 replies
  • 1 kudos
Latest Reply
Debayan
Databricks Employee
  • 1 kudos

Hi, the Databricks CLI can be used to download a file from DBFS: https://docs.databricks.com/dev-tools/cli/index.html. Also, you can refer to https://stackoverflow.com/questions/49019706/databricks-download-a-dbfs-filestore-file-to-my-local-machine, which ...

  • 1 kudos
1 More Replies
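One workaround worth noting (an assumption on my part, not something stated in the thread): copy the file to an ASCII-only name with dbutils.fs.cp, so the download URL or CLI command no longer has to carry multibyte characters. The paths below are hypothetical.

# Hypothetical source path containing multibyte characters
src = "dbfs:/FileStore/データ/結果.csv"
# Copy to an ASCII-only name to sidestep URL-encoding problems
dst = "dbfs:/FileStore/exports/result.csv"
dbutils.fs.cp(src, dst)

# The copy can then be fetched at https://<workspace-url>/files/exports/result.csv
# or with the Databricks CLI:  databricks fs cp dbfs:/FileStore/exports/result.csv .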
kkawka1
by New Contributor III
  • 10854 Views
  • 7 replies
  • 10 kudos

Resolved! Removing files saved in the root FileStore

We have just started working with Databricks in one of my university modules, and the lecturers gave us a set of commands to practice saving data in the FileStore. One of the commands was the following: dbutils.fs.cp("/databricks-datasets/weathh...

Latest Reply
-werners-
Esteemed Contributor III
  • 10 kudos

You can delete files using the Data Explorer in the Databricks web UI. Another option is to use %fs or %sh in a notebook.

6 More Replies
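A small sketch of the %fs / dbutils route the reply mentions; the paths below are hypothetical examples, so check what is actually under /FileStore before deleting anything.

# List the contents first (Databricks notebook)
display(dbutils.fs.ls("dbfs:/FileStore/"))

# Remove a single file
dbutils.fs.rm("dbfs:/FileStore/my_copy_of_the_dataset.csv")

# Remove a directory and everything inside it (True = recursive delete)
dbutils.fs.rm("dbfs:/FileStore/old_experiments/", True)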
alejandrofm
by Valued Contributor
  • 2688 Views
  • 4 replies
  • 2 kudos

Resolved! Orphan (?) files on Databricks S3 bucket

Hi, I'm seeing a lot of empty (and not so empty) directories on paths like: xxxxxx.jobs/FileStore/job-actionstats/, xxxxxx.jobs/FileStore/job-result/, xxxxxx.jobs/command-results/. Can I create a lifecycle to delete old objects (files/directories)? How many days? w...

Latest Reply
alejandrofm
Valued Contributor
  • 2 kudos

Hi! I didn't know that; I'm purging right now. Is there a way to schedule that so logs are retained for less time? Maybe I want to keep the last 7 days of everything? Thanks!

3 More Replies
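If an S3 lifecycle rule turns out to be the right tool here, a minimal boto3 sketch could look like the following. The bucket name and the 7-day retention are assumptions, and whether a given prefix is safe to expire should be confirmed before enabling the rule.

import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="my-databricks-root-bucket",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-old-command-results",
                "Status": "Enabled",
                "Filter": {"Prefix": "xxxxxx.jobs/command-results/"},
                "Expiration": {"Days": 7},  # assumed retention window
            }
        ]
    },
)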
MichaelO
by New Contributor III
  • 13043 Views
  • 2 replies
  • 2 kudos

Resolved! Transfer files saved in filestore to either the workspace or to a repo

I built a machine learning model: lr = LinearRegression(); lr.fit(X_train, y_train), which I can save to the FileStore by: filename = "/dbfs/FileStore/lr_model.pkl"; with open(filename, 'wb') as f: pickle.dump(lr, f). Ideally, I wanted to save the model ...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 2 kudos

The Workspace and Repos are not fully available via DBFS, as they have separate access rights. It is better to use MLflow for your models, as it is like Git but for ML. I think using MLOps you can then also put your model in Git.

1 More Replies
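A minimal sketch of the MLflow route the reply suggests, assuming a scikit-learn model and a workspace where MLflow tracking is available; the toy training data below stands in for the X_train / y_train from the original post.

import numpy as np
import mlflow
import mlflow.sklearn
from sklearn.linear_model import LinearRegression

# Toy stand-in for the post's training data
X_train = np.array([[1.0], [2.0], [3.0]])
y_train = np.array([2.0, 4.0, 6.0])

lr = LinearRegression()
lr.fit(X_train, y_train)

# Log the model as an MLflow artifact instead of pickling it into /FileStore;
# the run then appears in the workspace's experiment tracking UI.
with mlflow.start_run():
    mlflow.sklearn.log_model(lr, artifact_path="model")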
cfregly
by Contributor
  • 6018 Views
  • 3 replies
  • 0 kudos
Latest Reply
easimadi
New Contributor II
  • 0 kudos

Hello, please help (not an answer): how do I download a complete CSV (>1000) result file from FileStore onto my laptop? I was trying to follow this instruction set from the SQL tutorial (Download All SQL - Scala).

2 More Replies
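One common approach (a sketch, assuming the full result is available as a Spark DataFrame named df): write it to /FileStore as a single CSV, then download it through the workspace's /files/ endpoint.

# Write the full result as one CSV file under /FileStore
(df.coalesce(1)
   .write
   .option("header", "true")
   .mode("overwrite")
   .csv("dbfs:/FileStore/exports/full_results"))

# Find the generated part-*.csv file name (Databricks notebook)
display(dbutils.fs.ls("dbfs:/FileStore/exports/full_results"))

# It can then be downloaded in a browser at
# https://<workspace-url>/files/exports/full_results/<part-file-name>.csv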