How can I save a Keras model from a Python notebook in Databricks to an S3 bucket?

manupmanoos
New Contributor III

I have a trained model in a Databricks Python notebook. How can I save it to an S3 bucket?

5 Replies

Kumaran
Databricks Employee

Hi @manupmanoos,

Thank you for posting your question in the Databricks community.

Here are the steps to save a Keras model from a Python notebook in Databricks to an AWS S3 bucket:

  1. Install the AWS SDK (boto3) and set up your credentials using Databricks secrets or environment variables.
  2. Train a Keras model, for example one built with model = keras.models.Sequential().
  3. Save the model locally with model.save('/dbfs/models/model.h5').
  4. Use boto3 to copy the saved model file to the S3 bucket. Here is an example:

# Set credentials and create an S3 client
import boto3
import os

# Read the AWS credentials from a Databricks secret scope
aws_access_key_id = dbutils.secrets.get(scope="<scope-name>", key="<access-key-name>")
aws_secret_access_key = dbutils.secrets.get(scope="<scope-name>", key="<secret-key-name>")
os.environ['AWS_ACCESS_KEY_ID'] = aws_access_key_id
os.environ['AWS_SECRET_ACCESS_KEY'] = aws_secret_access_key

s3_client = boto3.client('s3')

# Upload the saved model from the local file system to the S3 bucket
s3_bucket = "<bucket-name>"
s3_prefix = "<bucket-prefix>"

model_path = "/dbfs/models/model.h5"
s3_key = "{}/model.h5".format(s3_prefix)

s3_client.upload_file(model_path, s3_bucket, s3_key)
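
If you want to confirm that the upload succeeded, here is a small optional check, continuing with the s3_client, s3_bucket, and s3_key defined above (boto3's head_object returns the stored object's metadata):

# Optional sanity check: fetch the uploaded object's metadata
response = s3_client.head_object(Bucket=s3_bucket, Key=s3_key)
print("Uploaded {} bytes to s3://{}/{}".format(response["ContentLength"], s3_bucket, s3_key))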


manupmanoos
New Contributor III

Hi @Kumaran,

Could you please let me know how we would load the same saved model back into a Databricks notebook?

Thanks,

Manu

Kumaran
Databricks Employee

Hi @manupmanoos,

Thank you for posting your question in the Databricks community.

Below is an example of how to load the same saved model back into a Databricks notebook:

from keras.models import load_model

# Load the model directly from the DBFS path where it was saved
model = load_model('/dbfs/models/model.h5')
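
Once loaded, it behaves like any other Keras model. A minimal sketch, assuming (hypothetically) a model that expects 28x28 inputs; adjust the shape to your model's input layer:

import numpy as np

# Hypothetical input shape -- replace with the shape your model actually expects
sample = np.random.rand(1, 28, 28)
predictions = model.predict(sample)
print(predictions)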

manupmanoos
New Contributor III

Hi @Kumaran,

Sorry, I was not clear with my question. How do we load the saved model back from the S3 bucket?

Thanks,

Manu

Kumaran
Databricks Employee

Hi @manupmanoos,

Please check the code below, which shows how to load the saved model back from the S3 bucket:

import boto3
import os
from keras.models import load_model

# Set credentials and create an S3 client
aws_access_key_id = dbutils.secrets.get(scope="<scope-name>", key="<access-key-name>")
aws_secret_access_key = dbutils.secrets.get(scope="<scope-name>", key="<secret-key-name>")
os.environ['AWS_ACCESS_KEY_ID'] = aws_access_key_id
os.environ['AWS_SECRET_ACCESS_KEY'] = aws_secret_access_key

s3_client = boto3.client('s3')

# Specify the S3 bucket and model file path
s3_bucket = "<bucket-name>"
s3_prefix = "<bucket-prefix>"
s3_key = "{}/model.h5".format(s3_prefix)

# Download the model file from S3 (create the local directory first if needed)
local_model_path = "/dbfs/models/model.h5"
os.makedirs(os.path.dirname(local_model_path), exist_ok=True)
s3_client.download_file(s3_bucket, s3_key, local_model_path)

# Load the model using Keras
loaded_model = load_model(local_model_path)
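
As a quick check that the restored model is intact, you can print its architecture with Keras's standard summary() method:

# Inspect the restored model's architecture to confirm it loaded correctly
loaded_model.summary()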
