01-28-2022 01:49 PM
I built a machine learning model:
lr = LinearRegression()
lr.fit(X_train, y_train)
which I can save to the filestore by:
import pickle

filename = "/dbfs/FileStore/lr_model.pkl"
with open(filename, 'wb') as f:
    pickle.dump(lr, f)
Ideally, I wanted to save the model directly to a workspace or a repo, so I tried:
import os

filename = "/Users/user/lr_model.pkl"
os.makedirs(os.path.dirname(filename), exist_ok=True)
with open(filename, 'wb') as f:
    pickle.dump(lr, f)
but this does not work: the file never shows up in the workspace.
The only alternative I have now is to transfer the model from the FileStore to the workspace or a repo. How do I go about that?
Accepted Solutions
01-28-2022 04:30 PM
It's important to keep in mind that there are two file systems:
- The file system on the local machines that are part of the cluster
- The distributed file system (DBFS): https://docs.databricks.com/data/databricks-file-system.html

When you use Python without Spark, such as with sklearn, the code runs only on the driver, so "local" means local to the driver node. That storage goes away when the cluster terminates.
Try %sh ls / and %fs ls and compare the differences.
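To illustrate the distinction, here is a minimal sketch (the dict is a stand-in for a fitted model, and the paths are illustrative): a plain local path like the one below lives only on the driver's disk, while on Databricks a path under the /dbfs/ FUSE mount (e.g. /dbfs/FileStore/lr_model.pkl) persists in DBFS independently of the cluster.

```python
import os
import pickle
import tempfile

model = {"coef": [1.0, 2.0]}  # stand-in for a fitted sklearn model

# Local driver disk: this file disappears when the cluster terminates.
# On Databricks, writing to "/dbfs/FileStore/lr_model.pkl" instead would
# go through the FUSE mount into DBFS and survive cluster restarts.
local_path = os.path.join(tempfile.gettempdir(), "lr_model.pkl")
with open(local_path, "wb") as f:
    pickle.dump(model, f)

# Round-trip to confirm the pickle is readable
with open(local_path, "rb") as f:
    restored = pickle.load(f)

print(restored == model)
```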
02-01-2022 07:25 AM
Workspace and Repos are not fully accessible via DBFS, as they have separate access rights. It is better to use MLflow for your models, as it is like Git but for ML. With an MLOps workflow you can then also push your model to Git.

