create notebook programmatically

Avinash_Narala
New Contributor III

Hello,

I have json content of the notebook with me.

Can I know is there a way to create notebook with that content using python?

2 REPLIES

Kaniz
Community Manager

Hi @Avinash_Narala, you can use Python to convert JSON content into a DataFrame in Databricks.

To do this, you'll first convert the JSON content into a list of JSON strings, then parallelize the list to create an RDD, and finally use spark.read.json() to convert the RDD into a DataFrame. If you already have a DataFrame with a JSON column and want to extract and parse it, you can select the JSON column, convert it to an RDD of strings, and then parse it using spark.read.json().
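
For example, a minimal sketch of that approach (run inside a Databricks notebook where spark is already defined; the JSON strings below are placeholders for illustration) could look like this:

# A small list of JSON strings; these values are placeholders.
json_strings = [
    '{"id": 1, "name": "alpha"}',
    '{"id": 2, "name": "beta"}',
]

# Parallelize the list into an RDD of strings, then let Spark parse it into a DataFrame.
rdd = spark.sparkContext.parallelize(json_strings)
df = spark.read.json(rdd)
df.show()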

Additionally, if you want to create a Databricks job with separate tasks and parameters programmatically, you can use the Databricks SDK, but this step is optional. 
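
If you do want to try that, a minimal sketch with the Databricks SDK for Python (the job name, cluster ID, and notebook path below are placeholders) might look like this:

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

# WorkspaceClient picks up authentication from the usual Databricks config
# (environment variables, a config profile, or notebook-native auth).
w = WorkspaceClient()

created = w.jobs.create(
    name="example-notebook-job",                      # placeholder job name
    tasks=[
        jobs.Task(
            task_key="run_notebook",
            existing_cluster_id="<your-cluster-id>",  # placeholder cluster ID
            notebook_task=jobs.NotebookTask(
                notebook_path="/Users/someone@example.com/my_notebook"  # placeholder path
            ),
        )
    ],
)
print(created.job_id)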

Try this and let us know if this helps!

Avinash_Narala
New Contributor III

Hi Kaniz,

Thank you for your reply.

Actually, I don't want to create a DataFrame. I have the JSON content of a notebook, which I got by reading the notebook itself, but I can't create a new notebook from that JSON.
The code below shows in detail what I want to do:

 

import requests

# export_url, import_url, headers (with my API token), notebook_path and
# new_notebook_path are defined earlier in my script.

def export_notebook():
    export_payload = {
        "path": notebook_path,
        "format": "SOURCE"
    }
    response = requests.get(export_url, headers=headers, json=export_payload)
    response.raise_for_status()
    # The export endpoint returns the notebook content base64-encoded.
    return response.json()["content"]

def import_notebook(new_content):
    import_payload = {
        "path": new_notebook_path,
        "content": new_content
    }
    response = requests.post(import_url, headers=headers, json=import_payload)
    if response.status_code != 200:
        print(response.content)
    response.raise_for_status()

I can export my notebook and make changes to the JSON, but I can't import it back into my workspace.
 
Please help me with this.