Making the Python connector raise an error for invalid SQL when asking to plan a query

624398
New Contributor III

Hey all,

My aim is to validate a given SQL string without actually running it.

I thought I could use the `EXPLAIN` statement to do so.

So I tried using the `databricks-sql-connector` for Python to explain a query and thereby determine whether it's valid. Example Python code:

```

import databricks.sql

with databricks.sql.connect(...) as connection:
    with connection.cursor() as cursor:
        cursor.execute("EXPLAIN SELECT BAD-QUERY AS FOO")
        r = cursor.fetchall()

```

The problem with this implementation is that the driver does not throw an error; instead, it returns a string containing the error details.

Why is this a problem? I have to parse the string result to determine whether the explained query was valid.

So I was wondering if there is some kind of setting, parameter, or configuration I can use to change the behaviour described above.
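In the meantime, one workaround is to scan the fetched EXPLAIN output for error markers yourself and raise from your own code. This is a minimal sketch, not a connector feature: the marker strings (`Error occurred during query planning`, `AnalysisException`, `ParseException`) are assumptions based on typical Databricks error text and may need adjusting for your runtime version.

```python
# Hypothetical helper: raise on EXPLAIN output that looks like a planner error.
# Marker strings are assumptions about Databricks' error text, not a stable API.
ERROR_MARKERS = (
    "Error occurred during query planning",
    "AnalysisException",
    "ParseException",
)

def raise_if_invalid_plan(rows):
    """Given rows from cursor.fetchall() after an EXPLAIN, raise ValueError
    if the returned plan text contains a known error marker; otherwise
    return the plan as a single string."""
    plan_text = "\n".join(str(col) for row in rows for col in row)
    for marker in ERROR_MARKERS:
        if marker in plan_text:
            raise ValueError(f"Invalid SQL: {plan_text}")
    return plan_text
```

Usage would then be `cursor.execute("EXPLAIN " + query)` followed by `raise_if_invalid_plan(cursor.fetchall())`, turning the string result back into an exception.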

Many thanks in advance!

1 ACCEPTED SOLUTION


624398
New Contributor III

Hi @Hubert Dudek, thanks for replying!

The optimal solution I was looking for was one within the databricks-sql-connector itself.

Installing a large package such as pyspark for this single feature seems like too much for my project's purposes.

Thank you very much!


4 REPLIES

Hubert-Dudek
Esteemed Contributor III

Please try in pyspark:

```

try:
    spark.sql("SELECT BAD-QUERY AS FOO")._jdf.queryExecution().toString()
except Exception:
    print("incorrect query")

```

or just:

```

try:
    spark.sql("SELECT BAD-QUERY AS FOO").explain()
except Exception:
    print("incorrect query")

```


Kaniz
Community Manager

Hi @Nativ Issac, we haven't heard from you since the last response from @Hubert Dudek, and I was checking back to see whether his suggestions helped you. If you have found a solution, please share it with the community, as it may be helpful to others.

624398
New Contributor III

Hi, I'm still searching for a solution better suited to my use case.

Installing pyspark seems like too much (build time and extra packages I don't really need).

Is there a way to make this a feature request?
