Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to execute stored procedures on synapse sql pool from databricks

doodateika
New Contributor III

In the current version of Databricks, the previous methods for executing stored procedures seem to fail.

Both spark.sparkContext._gateway.jvm.java.sql.DriverManager and spark._sc._gateway.jvm.java.sql.DriverManager return an error saying they are JVM-dependent and will not work. Using pyodbc does not currently work like it used to, because drivers must be installed, init scripts on DBFS appear to be deprecated, and so does credential passthrough for storing the init scripts on ABFSS. So my question is: what is the current way to execute Synapse stored procedures from Databricks? Do I have it wrong and the methods above actually work? Or is there some new method that I have missed in the docs?
1 ACCEPTED SOLUTION

Accepted Solutions

Hi, my issue was "resolved" when I switched to personal compute with the 14.3 LTS ML runtime. OAuth did not seem to make a difference, since I still needed to use DriverManager at the end of the day. So if you need a temporary solution, I would suggest trying that, but I do hope there are plans to fix this in the future.

# Write your SQL statement as a string
statement = "TRUNCATE TABLE dbo.test"

# Fetch the JDBC DriverManager from the Spark JVM gateway
driver_manager = spark._sc._gateway.jvm.java.sql.DriverManager

# Create a connection using a JDBC URL plus SQL username and password
# (jdbc_url, user, and password must be defined beforehand)
con = driver_manager.getConnection(jdbc_url, user, password)

# Create a callable statement and execute it
exec_statement = con.prepareCall(statement)
exec_statement.execute()

# Close the statement and the connection
exec_statement.close()
con.close()
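The TRUNCATE statement above runs fine through prepareCall, but an actual stored procedure is usually invoked through the JDBC call-escape syntax. The sketch below is an assumption-laden illustration, not part of the accepted answer: the helper builds the escape string, and the procedure name dbo.usp_load_fact_sales in the commented usage is hypothetical.

```python
def call_escape(proc_name, n_params=0):
    """Build the JDBC call-escape syntax for a stored procedure,
    e.g. "{call dbo.usp_load(?, ?)}" for two parameters."""
    placeholders = ", ".join("?" * n_params)
    if n_params:
        return f"{{call {proc_name}({placeholders})}}"
    return f"{{call {proc_name}}}"

# Usage sketch with the DriverManager connection from the answer above
# (proc name and parameter value are hypothetical):
# exec_statement = con.prepareCall(call_escape("dbo.usp_load_fact_sales", 1))
# exec_statement.setString(1, "2024-01-01")
# exec_statement.execute()
# exec_statement.close()
```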


5 REPLIES

Kaniz_Fatma
Community Manager

Hi @doodateika, you can use the built-in JDBC connector with OAuth2 authentication to execute stored procedures. Another approach is to use Databricks SQL Warehouses to execute your stored procedures.
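One way to read this suggestion: instead of SQL username/password, pass an Azure AD access token to the Microsoft SQL Server JDBC driver, which reads it from the accessToken connection property. The sketch below only shows the property handling; how the token itself is obtained (MSAL, a service principal, etc.) is out of scope, and aad_token is a placeholder, not something the reply provides.

```python
def oauth_jdbc_properties(aad_token):
    """Connection properties for token-based auth with the Microsoft
    SQL Server JDBC driver (the token goes in "accessToken")."""
    return {
        "accessToken": aad_token,
        "encrypt": "true",
        "hostNameInCertificate": "*.sql.azuresynapse.net",
    }

# Usage sketch via the py4j gateway, mirroring the DriverManager
# approach elsewhere in this thread (aad_token is hypothetical):
# props = spark._sc._gateway.jvm.java.util.Properties()
# for k, v in oauth_jdbc_properties(aad_token).items():
#     props.setProperty(k, v)
# con = driver_manager.getConnection(jdbc_url, props)
```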

Hi, thanks for responding! How would I go about doing that? Do you have anything with more detail? I'm glad to try whatever suggestions you have! 🙂

javiermd13
New Contributor II

I'm facing the same issue, @Kaniz_Fatma. Is there a working solution to execute a stored procedure?


-werners-
Esteemed Contributor III

Can you create a connection to external data in Unity Catalog, and then run:

USE <connectiondb>;
EXEC <sp>
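From a notebook, the two statements above would presumably be issued in order through spark.sql. A minimal sketch, assuming a Unity Catalog connection already exists; the helper below and the names synapse_conn_db and dbo.usp_refresh_sales are hypothetical placeholders, not objects from this thread.

```python
def uc_exec_statements(connection_db, sproc):
    """Build the two-step USE/EXEC sequence suggested above.
    Both arguments are placeholder names to fill in."""
    return [f"USE {connection_db}", f"EXEC {sproc}"]

# Usage sketch (each statement would be passed to spark.sql in order):
# for stmt in uc_exec_statements("synapse_conn_db", "dbo.usp_refresh_sales"):
#     spark.sql(stmt)
```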
