Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to register a JDBC Spark dialect in Python?

User16765131552
Contributor III

I am trying to read from a Databricks table over JDBC, using the JDBC URL taken from a cluster in the Databricks workspace. I am getting this error:

 java.sql.SQLDataException: [Simba][JDBC](10140) Error converting value to int.

After these statements:

jdbcConnUrl = "jdbc:spark://adb....."
testquery = "(select * from db.table limit 3)"
testdf = (spark.read.format("jdbc")
          .option("url", jdbcConnUrl)
          .option("dbtable", testquery)
          .option("fetchsize", "10000")
          .load())
testdf.show()

All the solutions I have come across for this issue are in Scala, but I am using Python. I want a Python equivalent of this code:

import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

JdbcDialects.registerDialect(new JdbcDialect() {
  override def canHandle(url: String): Boolean =
    url.toLowerCase.startsWith("jdbc:spark:")
  override def quoteIdentifier(column: String): String = column
})

5 REPLIES

imstwz1
New Contributor II

Hi @Brad Powell, were you able to solve this issue in Python? I'm also stuck on this issue in Python; the solution is available only in Scala and I need it in Python. Could you please help me solve this issue? Thanks.

Meghala
Valued Contributor II

It was helpful, thank you.

Yadu
New Contributor II

Hi @Brad Powell / @Kaniz Fatma / @S Meghala - any update on this?

KKDataEngineer
New Contributor III

is there a solution for this?

@Retired_mod 

I was able to solve this:

  • Add this code to a simple Scala class/object method.
  • Package it into a JAR file.
  • Install this JAR file on the cluster where you execute the JDBC code.
  • Add the line of code below before executing the JDBC code in your PySpark code. This will invoke that Scala class's method directly in your JVM:
    spark.sparkContext._jvm.<scalaclass fully qualified>.<method>
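A minimal sketch of what that Scala helper might look like, assuming the dialect logic from the snippet earlier in the thread (the package, object, and method names here are hypothetical, chosen for illustration):

```scala
package com.example.dialects  // hypothetical package name

import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

// Packaged into a JAR and installed on the cluster, this object can then be
// invoked from PySpark through the JVM gateway.
object SparkDialectRegistrar {
  def register(): Unit = {
    JdbcDialects.registerDialect(new JdbcDialect() {
      // Claim URLs that use the Simba Spark JDBC driver scheme
      override def canHandle(url: String): Boolean =
        url.toLowerCase.startsWith("jdbc:spark:")
      // Return identifiers as-is instead of wrapping them in double quotes
      override def quoteIdentifier(column: String): String = column
    })
  }
}
```

From PySpark, assuming these hypothetical names, the call in the last step above would then read `spark.sparkContext._jvm.com.example.dialects.SparkDialectRegistrar.register()`, executed once before `spark.read.format("jdbc")...load()`.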
