08-09-2023 05:51 AM
I have a trained model in a Databricks Python notebook. How can I save it to an S3 bucket?
08-13-2023 11:47 PM - edited 08-13-2023 11:48 PM
Hi @manupmanoos,
Thank you for posting your question in the Databricks community.
Here are the steps to save a Keras model from a Python notebook in Databricks to an AWS S3 bucket:
# Set AWS credentials (stored as Databricks secrets) and create an S3 client
import boto3
import os
aws_access_key_id = dbutils.secrets.get(scope="<scope-name>", key="<access-key-name>")
aws_secret_access_key = dbutils.secrets.get(scope="<scope-name>", key="<secret-key-name>")
os.environ['AWS_ACCESS_KEY_ID'] = aws_access_key_id
os.environ['AWS_SECRET_ACCESS_KEY'] = aws_secret_access_key
s3_client = boto3.client('s3')
# Upload the saved model from the driver's local file system (DBFS mount) to the S3 bucket
s3_bucket = "<bucket-name>"
s3_prefix = "<bucket-prefix>"
model_path = "/dbfs/models/model.h5"  # path the trained model was saved to
s3_key = "{}/model.h5".format(s3_prefix)
s3_client.upload_file(model_path, s3_bucket, s3_key)
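Note that this assumes the trained model has already been written to /dbfs/models/model.h5. A minimal sketch of that step, assuming a trained keras.Model object named model (the directory name is an assumption, use whatever path you prefer):
# Save the trained Keras model to a DBFS-backed path so it is visible at /dbfs/... on the driver
import os
os.makedirs("/dbfs/models", exist_ok=True)  # ensure the directory exists
model.save("/dbfs/models/model.h5")  # `model` is assumed to be your trained keras.Model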
08-15-2023 08:02 AM
Hi @Kumaran,
Could you please let me know how we would load the same saved model back into a Databricks notebook?
Thanks,
Manu
08-15-2023 08:38 AM
Hi @manupmanoos,
Thank you for posting your question in the Databricks community.
Below is an example of how to load the saved model back into a Databricks notebook:
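A minimal sketch, assuming the model was saved to the same DBFS path used above (/dbfs/models/model.h5):
from keras.models import load_model
# Load the model directly from the DBFS path it was saved to
# (the path is an assumption carried over from the earlier example)
loaded_model = load_model("/dbfs/models/model.h5")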
08-15-2023 08:44 AM
Hi @Kumaran,
Sorry, I was not clear with my question. How do we load the saved model back from the S3 bucket?
Thanks,
Manu
08-15-2023 10:35 AM
Hi @manupmanoos,
Please check the code below to see how to load the saved model back from the S3 bucket:
import boto3
import os
from keras.models import load_model
# Set AWS credentials (stored as Databricks secrets) and create an S3 client
aws_access_key_id = dbutils.secrets.get(scope="<scope-name>", key="<access-key-name>")
aws_secret_access_key = dbutils.secrets.get(scope="<scope-name>", key="<secret-key-name>")
os.environ['AWS_ACCESS_KEY_ID'] = aws_access_key_id
os.environ['AWS_SECRET_ACCESS_KEY'] = aws_secret_access_key
s3_client = boto3.client('s3')
# Specify the S3 bucket and the key of the model file
s3_bucket = "<bucket-name>"
s3_prefix = "<bucket-prefix>"
s3_key = "{}/model.h5".format(s3_prefix)
# Download the model file from S3 to a DBFS-backed local path
local_model_path = "/dbfs/models/model.h5"
os.makedirs(os.path.dirname(local_model_path), exist_ok=True)  # ensure the directory exists
s3_client.download_file(s3_bucket, s3_key, local_model_path)
# Load the model using Keras
loaded_model = load_model(local_model_path)
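As a quick sanity check, assuming the download and load succeeded, you can inspect the reloaded model or run a prediction on data shaped like the training inputs (x_new below is a hypothetical input batch, not part of the original example):
# Verify the reloaded model
loaded_model.summary()
predictions = loaded_model.predict(x_new)  # x_new is a hypothetical array matching the model's input shape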