Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to read a SQL notebook in a Python notebook in the workspace

NavyaD
New Contributor III

I have a notebook named ecom_sellout.sql under the path notebooks/python/dataloader/queries.

I have another notebook (named dataloader, under the path notebooks/python/dataloader) in which I am calling this SQL notebook.

My code runs perfectly fine in Repos; however, I would like to run it in the workspace and schedule the jobs through workspace notebooks.

I am sharing the piece of code here:

from pyspark.sql import DataFrame  # at the top of the module, for the return type annotation

def query_database_with_params(self, path: str, table_name: str, **kwargs) -> DataFrame:
    """
    Load a SQL statement from a file and query the dataset in Spark, specifying the table name
    and the country to filter the table.

    Args:
      path (str): String specifying the location of the SQL statement
      table_name (str): String specifying the name of the table
      **kwargs: Arguments which store strings to fill the parameters of the SQL file

    Returns:
      DataFrame: PySpark DataFrame produced by running the parameterized SQL
    """
    # Read the SQL template from the file, substitute the parameters, and run it.
    with open(path, 'r') as f:
        query = f.read()
    query = query.format(table_name=table_name, **kwargs)
    return self.spark.sql(query)
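
For context, the call looks roughly like this (the wrapper class name and parameter values below are illustrative, not the exact ones from my job):

# Illustrative usage; "DataLoader" is a placeholder name for the class
# that actually defines query_database_with_params.
loader = DataLoader(spark)
df = loader.query_database_with_params(
    "notebooks/python/dataloader/queries/ecom_sellout.sql",  # relative path that resolves when run from Repos
    table_name="ecom_sellout",
    country="US",
)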

The function fails to open the path and says that the file or directory doesn't exist.

However, the file is present in that same location.

Can someone please explain how to read a SQL file in the Databricks workspace?

2 REPLIES

Aviral-Bhardwaj
Esteemed Contributor III

Use magic commands; alternatively, you can use Python with the SQL formatted there. It will work.
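
For example, a minimal sketch of the "Python with the SQL formatted there" idea (table and column names are illustrative):

# Keep the SQL as a Python string template inside the notebook instead of a
# separate file, fill in the parameters, and run it with spark.sql.
query_template = """
SELECT *
FROM {table_name}
WHERE country = '{country}'
"""

df = spark.sql(query_template.format(table_name="ecom_sellout", country="US"))
display(df)

A %sql magic cell can likewise run the query directly in a separate cell.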

The function is under a class. Can you please give a brief explanation of how I can use magic commands? Also, what do you mean by Python and SQL formatting?