Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Databricks to Databricks connection

RK_AV
New Contributor III

How do you connect to an Azure Databricks instance from another Databricks instance?

I need to access database views created in one Databricks instance from a PySpark notebook running in another Databricks instance.

I'd appreciate it if anyone has sample code to connect and query the views.

1 ACCEPTED SOLUTION


Kaniz_Fatma
Community Manager

Hi @Venkata Ramakrishna Alvakonda, there are two ways to execute a notebook within another notebook in Databricks:

  • Method 1: The %run command

The first and most straightforward way to execute another notebook is the %run command. Executing %run [notebook] takes the entire content of the specified notebook, pastes it in place of the %run command, and executes it. The specified notebook runs in the scope of the main notebook, which means that all variables defined in the main notebook before the call can be accessed in the second notebook. And, vice versa, all functions and variables defined in the executed notebook can then be used in the current notebook.

This approach makes it easy to chain notebooks together. On the other hand, there is no explicit way to pass parameters to the second notebook; however, you can use variables already declared in the main notebook.

Note that %run must be written in a separate cell; otherwise you won't be able to execute it.
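As a minimal sketch of the pattern above (the notebook name ./shared_setup and the variable base_path are hypothetical, not from this thread), across two cells of the main notebook:

```
# Cell 1: inline the other notebook; %run must be alone in its cell
%run ./shared_setup

# Cell 2: anything defined in shared_setup is now in scope,
# e.g. a variable base_path declared there
print(base_path)
```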

  • Method 2: The dbutils.notebook.run command

The other, more complex approach is to execute the dbutils.notebook.run command. In this case, a new instance of the executed notebook is created, and the computations are done within it, in its own scope, completely separate from the main notebook. This means that no functions or variables defined in the executed notebook can be reached from the main notebook. On the other hand, this can be a plus if you don't want functions and variables to get unintentionally overridden.

The benefit of this approach is that you can pass parameter values directly to the executed notebook, and you can also create alternate workflows based on the exit value returned when the notebook execution finishes. This comes in handy when building more complex solutions.

The dbutils.notebook.run command accepts three parameters:

  • path: relative path to the executed notebook
  • timeout (in seconds): kills the run if its execution time exceeds the given timeout
  • arguments: a dictionary of arguments passed to the executed notebook; these must be implemented as widgets in the executed notebook

You can find the examples in the Source link.
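As a hedged sketch of the call and the branch-on-exit-value pattern (the child path ./child_notebook, the widget name run_date, and the exit strings "success"/"proceed"/"retry" are assumptions for illustration, not from this thread), the call can be wrapped in a small helper. Inside Databricks, the dbutils object is provided by the runtime:

```python
def run_and_branch(dbutils, path="./child_notebook", timeout_seconds=600):
    """Run a child notebook and branch on the string it returns via
    dbutils.notebook.exit(...). The keys in the arguments dictionary must
    match widgets defined in the child notebook, e.g. created there with
    dbutils.widgets.text("run_date", "")."""
    exit_value = dbutils.notebook.run(
        path,                       # relative path to the executed notebook
        timeout_seconds,            # kill the run past this many seconds
        {"run_date": "2024-01-01"}  # passed to the child's widgets
    )
    # Alternate workflow based on the child's exit value
    return "proceed" if exit_value == "success" else "retry"
```

In a notebook you would call run_and_branch(dbutils) directly; wrapping the call in a function just keeps the exit-value branching explicit.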


2 REPLIES

Anonymous
Not applicable

Hi there, @Venkata Ramakrishna Alvakonda! My name is Piper, and I'm a moderator for the community. Thank you for your great question! Let's give the community a chance to respond first, and then we'll circle back around.

If the community's response answers your question, be sure to mark the most helpful answer as best so that other members can find the solution more quickly.

Let us know how it goes. 🙂

