Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Connect to spark session and uc tables in python file

ajay_wavicle
Databricks Partner

How do I connect to the Spark session and read Unity Catalog (UC) tables from a Python file? I want to read UC tables in Python modules in the Databricks workspace. How do I access the current SparkSession?

 

1 ACCEPTED SOLUTION

Accepted Solutions

szymon_dybczak
Esteemed Contributor III

Hi @ajay_wavicle ,

Azure Databricks automatically creates a SparkContext for each compute cluster, and creates an isolated SparkSession for each notebook or job executed against the cluster.

So the following should work in a Python module in the Databricks workspace (use the `spark` variable):

# `spark` is provided automatically on Databricks compute
col_df = spark.sql("SELECT * FROM dev_factoring.information_schema.columns")
display(col_df)

(screenshot: szymon_dybczak_0-1770124005571.png)

Another approach would be to create the session explicitly:

from pyspark.sql import SparkSession

# Returns the active session if one exists, otherwise creates a new one
spark_session = SparkSession.builder.getOrCreate()
col_df = spark_session.sql("SELECT * FROM dev_factoring.information_schema.columns")
display(col_df)

View solution in original post

3 REPLIES


hi @szymon_dybczak , I am talking about connecting from a plain Python file, not a Databricks notebook (dbc)

szymon_dybczak
Esteemed Contributor III

Look carefully @ajay_wavicle , this is a plain old Python file. Do you see any notebook cells in the screenshot?

(screenshot: szymon_dybczak_0-1770127131337.png)