Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to Copy the notebooks from one environment to another environment

databicky
Contributor II

I have a requirement to copy notebooks from one environment to another automatically, using a single notebook. How can I achieve this?

1 REPLY

Isi
Contributor

Hey @databicky ,

You can automate copying notebooks from one Databricks environment to another using the Databricks REST API from within a notebook. Here is the easiest way I found to do it:

import json
import requests
import base64

# ==============================
#  CONFIGURATION: Databricks Details
# ==============================

# Source Databricks details
SOURCE_DATABRICKS_HOST = "https://<source-workspace>.cloud.databricks.com"
SOURCE_TOKEN = "dapiXXXXXXXXXXXXXXXXXXXXXXXXX"
NOTEBOOK_PATH = "/Workspace/Users/<source-user>/notebook_name"

# Destination Databricks details
DEST_DATABRICKS_HOST = "https://<destination-workspace>.cloud.databricks.com"
DEST_TOKEN = "dapiYYYYYYYYYYYYYYYYYYYYYYYYY"
DEST_NOTEBOOK_PATH = "/Workspace/Users/<destination-user>/notebook_name"

# ==============================
#  STEP 1: EXPORT NOTEBOOK FROM SOURCE
# ==============================
export_url = f"{SOURCE_DATABRICKS_HOST}/api/2.0/workspace/export"
headers = {"Authorization": f"Bearer {SOURCE_TOKEN}"}
params = {"path": NOTEBOOK_PATH, "format": "SOURCE"}

response = requests.get(export_url, headers=headers, params=params)

if response.status_code == 200:
    notebook_data = response.json()
    
    # Decode Base64 content
    notebook_content = base64.b64decode(notebook_data["content"]).decode("utf-8")
    
    print("\nNotebook exported successfully!")
    
    # Save locally as a file for reference (optional)
    with open("exported_notebook.py", "w", encoding="utf-8") as f:
        f.write(notebook_content)

    print("Decoded content saved as 'exported_notebook.py'\n")

else:
    print(f"Failed to export: {response.status_code} - {response.text}")
    # Stop here so the import step doesn't run without any notebook content
    raise SystemExit(1)

# ==============================
#  STEP 2: IMPORT NOTEBOOK TO DESTINATION
# ==============================

# Encode content in Base64 for importing
encoded_content = base64.b64encode(notebook_content.encode()).decode()

import_url = f"{DEST_DATABRICKS_HOST}/api/2.0/workspace/import"
headers = {"Authorization": f"Bearer {DEST_TOKEN}"}
data = {
    "path": DEST_NOTEBOOK_PATH,
    "format": "SOURCE",
    "content": encoded_content,
    # Assumes a Python notebook; change "language" for Scala, SQL, or R notebooks
    "language": "PYTHON",
}

print(f"Sending request to import notebook to: {DEST_NOTEBOOK_PATH}")

response = requests.post(import_url, headers=headers, json=data)

if response.status_code == 200:
    print("\nNotebook imported successfully to the destination workspace!")
else:
    print(f"\nFailed to import: {response.status_code} - {response.text}")
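
If you need to copy a whole folder of notebooks rather than a single one, you can drive the same export/import calls with the /api/2.0/workspace/list endpoint. A minimal sketch, reusing the hosts and tokens from the configuration above (the folder paths below are placeholders you would replace):

import requests

SOURCE_FOLDER = "/Workspace/Users/<source-user>/<folder>"      # placeholder source folder
DEST_FOLDER = "/Workspace/Users/<destination-user>/<folder>"   # placeholder destination folder

src_headers = {"Authorization": f"Bearer {SOURCE_TOKEN}"}
dest_headers = {"Authorization": f"Bearer {DEST_TOKEN}"}

# Make sure the destination folder exists
requests.post(
    f"{DEST_DATABRICKS_HOST}/api/2.0/workspace/mkdirs",
    headers=dest_headers,
    json={"path": DEST_FOLDER},
).raise_for_status()

# List the objects directly under the source folder
listing = requests.get(
    f"{SOURCE_DATABRICKS_HOST}/api/2.0/workspace/list",
    headers=src_headers,
    params={"path": SOURCE_FOLDER},
)
listing.raise_for_status()

for obj in listing.json().get("objects", []):
    if obj["object_type"] != "NOTEBOOK":
        continue  # this simple version skips sub-folders and non-notebook files

    # Export one notebook; the response content is already Base64-encoded
    exported = requests.get(
        f"{SOURCE_DATABRICKS_HOST}/api/2.0/workspace/export",
        headers=src_headers,
        params={"path": obj["path"], "format": "SOURCE"},
    )
    exported.raise_for_status()

    # Import it under the destination folder, keeping the notebook name and language
    notebook_name = obj["path"].split("/")[-1]
    imported = requests.post(
        f"{DEST_DATABRICKS_HOST}/api/2.0/workspace/import",
        headers=dest_headers,
        json={
            "path": f"{DEST_FOLDER}/{notebook_name}",
            "format": "SOURCE",
            "content": exported.json()["content"],
            "language": obj.get("language", "PYTHON"),
            "overwrite": True,  # replace the notebook if it already exists at the destination
        },
    )
    imported.raise_for_status()
    print(f"Copied {obj['path']} -> {DEST_FOLDER}/{notebook_name}")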

By the way, you don't have to run this in a notebook; you can also run it locally as a plain Python script.
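
If you do run it locally, one small improvement is to read the tokens from environment variables instead of hardcoding them in the script (the variable names below are just examples):

import os

# Example environment variable names; export them in your shell before running the script
SOURCE_TOKEN = os.environ["SOURCE_DATABRICKS_TOKEN"]
DEST_TOKEN = os.environ["DEST_DATABRICKS_TOKEN"]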

I tested it and it works 🙂

Isi
