I was able to solve the problem! The root cause was a missing ODBC driver, so pyodbc and SQLAlchemy could not connect. As a workaround I went through the native Java API (java.sql via Spark's JVM gateway) instead, and it works.
This is the example code:
jdbcUsername = "username"
jdbcPassword = "password"
jdbcUrl = "jdbc:sqlserver://prtc.database.windows.net:1433;database=db02"

# Open a JDBC connection through Spark's JVM gateway. The SQL Server driver
# (com.microsoft.sqlserver.jdbc.SQLServerDriver) is already on the cluster
# classpath, so DriverManager locates and registers it automatically.
connection = spark._jvm.java.sql.DriverManager.getConnection(jdbcUrl, jdbcUsername, jdbcPassword)
stmt = connection.createStatement()
sql = "delete from dbo.ExampleTable where ID = 2"
stmt.execute(sql)
stmt.close()
connection.close()
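If you run this kind of statement more than once, it can help to wrap the pattern in a small helper and use a `PreparedStatement` so the ID is bound as a parameter rather than spliced into the SQL string. This is only a sketch: the function name `delete_row` and the idea of passing `spark._jvm` in as an argument are mine; `DriverManager`, `prepareStatement`, `setInt`, and `executeUpdate` are the standard `java.sql` API reached through Py4J.

```python
def delete_row(jvm, jdbc_url, user, password, table, row_id):
    """Delete one row by ID over JDBC. Pass jvm = spark._jvm."""
    conn = jvm.java.sql.DriverManager.getConnection(jdbc_url, user, password)
    try:
        # Bind the ID as a parameter instead of concatenating it into the SQL.
        stmt = conn.prepareStatement(f"DELETE FROM {table} WHERE ID = ?")
        try:
            stmt.setInt(1, row_id)
            return stmt.executeUpdate()  # number of rows deleted
        finally:
            stmt.close()
    finally:
        conn.close()
```

Called as `delete_row(spark._jvm, jdbcUrl, jdbcUsername, jdbcPassword, "dbo.ExampleTable", 2)` it does the same thing as the code above, but closes the statement and connection even if the delete fails.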