@KARTHICK Nā:
Yes, you can use the same SparkSession instance across multiple notebooks in Databricks. Here's how you can do it:
In the first notebook where you create the SparkSession instance, assign it to a variable:
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("my_app").getOrCreate()
In the second notebook, where you want to use the same SparkSession, you can bring that variable into scope with the %run command:
%run "/path/to/first/notebook"
This will execute the first notebook and make all the variables defined in it available in the current notebook. So if you defined a variable called spark in the first notebook, you can access it in the second notebook after running the %run command.
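For example, any other variable defined in the first notebook comes along as well (the name input_path below is just an illustration, not something from the original notebooks):

# In the first notebook:
input_path = "/mnt/raw/events"

# In the second notebook, after %run:
print(input_path)  # prints /mnt/raw/events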
Note that you need to provide the full workspace path to the first notebook in the %run command; in Databricks, the path does not include a file extension.
Once you have access to the spark variable in the second notebook, you can use it just like you would in the first notebook:
df = spark.read.csv("/path/to/data.csv")
This will create a DataFrame using the same SparkSession instance that was created in the first notebook.
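If you want to confirm that both notebooks are using the same session, one quick sanity check (purely illustrative, not required) is to print the Spark application ID in each notebook and compare:

print(spark.sparkContext.applicationId)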
Keep in mind that when you use the %run command to access variables from another notebook, you are essentially importing those variables into the current notebook's context. So if you reassign a variable in the second notebook, the change will not propagate back to the first notebook. If you need to share data between notebooks in a way that lets changes made in one notebook be seen by another, consider writing to a shared database table or file system instead, as sketched below.
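For instance, here is a minimal sketch using a Delta table as the shared store; the table name shared_db.shared_data is hypothetical, and any table or path both notebooks can reach would work:

# In the notebook that produces the data:
df.write.format("delta").mode("overwrite").saveAsTable("shared_db.shared_data")

# In the notebook that consumes it:
shared_df = spark.read.table("shared_db.shared_data")

Because both notebooks go through the table rather than an in-memory variable, updates written by one notebook are visible to the other on its next read.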