Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Transfer files saved in filestore to either the workspace or to a repo

MichaelO
New Contributor III

I built a machine learning model:

from sklearn.linear_model import LinearRegression

lr = LinearRegression()
lr.fit(X_train, y_train)

which I can save to the filestore by:

import pickle

filename = "/dbfs/FileStore/lr_model.pkl"
with open(filename, 'wb') as f:
    pickle.dump(lr, f)

Ideally, I wanted to save the model directly to a workspace or a repo so I tried:

import os

filename = "/Users/user/lr_model.pkl"
os.makedirs(os.path.dirname(filename), exist_ok=True)
with open(filename, 'wb') as f:
    pickle.dump(lr, f)

but it did not work: the file never shows up in the workspace.

The only alternative I have now is to transfer the model from the FileStore to the workspace or a repo. How do I go about that?
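The transfer itself is just a file copy plus a pickle round-trip. A minimal, runnable sketch of that pattern follows; temporary directories stand in for the real Databricks paths (/dbfs/FileStore and a workspace or repo folder, which are assumptions here), and a plain dict stands in for the fitted sklearn model, since any picklable object behaves the same way:

```python
import os
import pickle
import shutil
import tempfile

# Stand-in for a fitted sklearn model; any picklable object works the same way.
model = {"coef": [0.5, -1.2], "intercept": 3.0}

# Temp directories stand in for the two real locations:
dbfs_root = tempfile.mkdtemp()       # stands in for /dbfs/FileStore
workspace_root = tempfile.mkdtemp()  # stands in for a workspace or repo folder

# 1. Save the model under the DBFS-style path.
src = os.path.join(dbfs_root, "lr_model.pkl")
with open(src, "wb") as f:
    pickle.dump(model, f)

# 2. Copy it to the workspace-style path with plain shutil.
dst = os.path.join(workspace_root, "lr_model.pkl")
shutil.copy(src, dst)

# 3. Load it back from the destination to verify the transfer.
with open(dst, "rb") as f:
    restored = pickle.load(f)
assert restored == model
```

On Databricks the same shutil.copy call would take the real source and destination paths instead of the temp directories.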

1 ACCEPTED SOLUTION

Accepted Solutions

Anonymous
Not applicable

It's important to keep in mind that there are 2 file systems:

  1. The file system on the local machines that are part of the cluster
  2. The distributed file system https://docs.databricks.com/data/databricks-file-system.html

When you use Python without Spark, as with sklearn, the code runs only on the driver, and "local" means local to the driver's disk. Anything written there goes away when the cluster terminates.

Try %sh ls / and %fs ls / and compare the output to see the difference.
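The "goes away when the cluster does" point can be sketched without a cluster at all: in the snippet below a TemporaryDirectory stands in for the driver's ephemeral local disk, and leaving the context stands in for the cluster terminating. (Both are stand-ins for illustration, not Databricks behavior you can run locally.)

```python
import os
import tempfile

# The TemporaryDirectory stands in for the driver's ephemeral local disk;
# leaving the context stands in for the cluster terminating.
with tempfile.TemporaryDirectory() as driver_local:
    path = os.path.join(driver_local, "lr_model.pkl")
    with open(path, "wb") as f:
        f.write(b"model bytes")
    assert os.path.exists(path)  # visible while the "cluster" is up

# After "termination", the driver-local file is gone.
assert not os.path.exists(path)
```

Files written under the /dbfs mount, by contrast, live in the distributed file system and survive cluster termination.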


2 REPLIES


Hubert-Dudek
Esteemed Contributor III

Workspace files and Repos are not fully accessible via DBFS, as they have separate access controls. It is better to use MLflow for your models; it is like Git, but for ML. With an MLOps workflow you can then also push your model to Git.
