Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Attach notebook to cluster via REST API

akihiko
New Contributor III

Is it possible to attach a notebook to a cluster and run it via the REST API?

The closest approach I have found is to run a notebook, export the results (as HTML!), and import them into the workspace again, but this does not let us retain the original execution context. The 1.2 API allows for creating and manipulating execution contexts, but I can't figure out how to actually attach a notebook.
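For reference, this is roughly what the 1.2 API exposes: execution contexts and ad-hoc commands on a cluster, with no call for attaching a notebook. A minimal sketch, assuming a workspace host and personal access token in environment variables and a placeholder cluster ID:

import os
import requests

host = os.environ["DATABRICKS_HOST"]            # e.g. https://<workspace>.cloud.databricks.com
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
cluster_id = "0123-456789-abcde123"             # placeholder cluster ID

# Create an execution context on the cluster (the 1.2 API takes form-encoded bodies)
ctx = requests.post(f"{host}/api/1.2/contexts/create", headers=headers,
                    data={"language": "python", "clusterId": cluster_id}).json()

# Run a single command inside that context
cmd = requests.post(f"{host}/api/1.2/commands/execute", headers=headers,
                    data={"language": "python", "clusterId": cluster_id,
                          "contextId": ctx["id"], "command": "print(spark.version)"}).json()

# Poll the command's status, then read its result
status = requests.get(f"{host}/api/1.2/commands/status", headers=headers,
                      params={"clusterId": cluster_id, "contextId": ctx["id"],
                              "commandId": cmd["id"]}).json()
print(status["status"], status.get("results"))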

My use case is that we have notebook templates for security incidents that contain pre-defined cells/queries. We want the system to run these queries before handing the notebook off to an analyst, who can then do further analysis.


4 REPLIES

Ajay-Pandey
Esteemed Contributor III

Hi @Akihiko Nagata,

I think for now we don't have such an option available through the Databricks API.

Ajay Kumar Pandey

Vivian_Wilfred
Databricks Employee

Hi @Akihiko Nagata, have you checked the Jobs API? You can run a job on the existing cluster using the notebook of concern. I believe this is the only way.

https://docs.databricks.com/dev-tools/api/latest/jobs.html#operation/JobsRunsSubmit
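A minimal sketch of that call, using the one-time runs/submit endpoint from the Jobs 2.1 API; the run name, cluster ID, and notebook path below are placeholders:

import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Submit a one-time run of the notebook on an existing cluster (Jobs 2.1 API)
run = requests.post(
    f"{host}/api/2.1/jobs/runs/submit",
    headers=headers,
    json={
        "run_name": "incident-notebook-run",                    # placeholder run name
        "tasks": [
            {
                "task_key": "run_notebook",
                "existing_cluster_id": "0123-456789-abcde123",  # placeholder cluster ID
                "notebook_task": {"notebook_path": "/Incidents/template"},  # placeholder path
            }
        ],
    },
).json()
print(run["run_id"])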

akihiko
New Contributor III

Hi @Vivian Wilfred, I did try that, but it won't let you keep the execution context, so it only partially works.

I guess for now I will use this API and export the executed notebook.
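For anyone following the same route: the same Jobs API has a runs/export endpoint that returns the rendered notebook views as HTML once the run finishes. A minimal sketch, with the run ID standing in for the one returned by runs/submit:

import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
run_id = 12345  # placeholder: the run_id returned by runs/submit

# Export the rendered views of a finished run (Jobs 2.0 API)
resp = requests.get(
    f"{host}/api/2.0/jobs/runs/export",
    headers=headers,
    params={"run_id": run_id, "views_to_export": "ALL"},
).json()

# Each view is an HTML rendering of a notebook or dashboard in the run
for view in resp["views"]:
    with open(f"{view['name']}.html", "w") as f:
        f.write(view["content"])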

baert23
New Contributor II

I'm looking for a way to programmatically copy a notebook in Databricks using the workspace/export and workspace/import APIs. Once the notebook is copied, I want to automatically attach it to a specific cluster using its cluster ID. The challenge is that I can't use Jobs because the notebook contains widgets that require user interaction.

Is there a way to achieve this, or does anyone know of a workaround to programmatically attach a notebook to a cluster after it has been imported?
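The copy step itself can be sketched with those two Workspace API endpoints (the paths below are placeholders); the attach step is the part with no public API, since a notebook only gets an execution context when it is opened against a cluster or run through a job:

import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Export the source of the template notebook (Workspace API)
src = requests.get(
    f"{host}/api/2.0/workspace/export",
    headers=headers,
    params={"path": "/Templates/incident", "format": "SOURCE"},  # placeholder source path
).json()

# Import it under a new path; the content stays base64-encoded end to end
requests.post(
    f"{host}/api/2.0/workspace/import",
    headers=headers,
    json={
        "path": "/Incidents/incident-copy",   # placeholder destination path
        "format": "SOURCE",
        "language": "PYTHON",
        "content": src["content"],
        "overwrite": True,
    },
).raise_for_status()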
