Data Engineering

Not able to connect to Denodo VDP from Databricks

databrick_comm
New Contributor II

I would like to connect to Denodo VDP from a Databricks workspace.

I have installed the ODBC client and the Denodo jar on the cluster, but I'm not sure about the remaining steps.

Could you please help me?

1 ACCEPTED SOLUTION


Hi @sathyanarayan kokku,

Once you've installed the Denodo jar on the cluster, create a connection and execute a query in a notebook following these steps (more info can be found at SQL Databases using JDBC):

Class.forName("com.denodo.vdp.jdbc.Driver")

val jdbcHostname = "*****"
val jdbcPort = "*****"
val jdbcDatabase = "*****"

// Create the JDBC URL without passing in the user and password parameters.
val jdbcUrl = s"jdbc:vdb://${jdbcHostname}:${jdbcPort}/${jdbcDatabase}"

// Create a Properties() object to hold the parameters.
import java.util.Properties
val connectionProperties = new Properties()

connectionProperties.put("user", "admin")
connectionProperties.put("password", "mypass")

// A pushdown query must be a parenthesized subquery with an alias.
val pushdown_query = "(select * from oracle_customers) oracle_customers"

val df = spark.read.jdbc(url = jdbcUrl, table = pushdown_query, properties = connectionProperties)
display(df)
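One detail worth calling out: Spark substitutes the `table` argument of `spark.read.jdbc` into `SELECT * FROM <table>`, so a pushdown query must be written as a parenthesized subquery with an alias. A minimal sketch of that shape (the table name `oracle_customers` is taken from the example above; the helper name is my own):

```scala
// Wrap a raw query into the "(query) alias" form that Spark's JDBC reader
// expects, since Spark substitutes this string into: SELECT * FROM <table>
def asPushdownQuery(query: String, alias: String): String = s"($query) $alias"

val q = asPushdownQuery("select * from oracle_customers", "oracle_customers")
println(q) // (select * from oracle_customers) oracle_customers
```

Omitting the closing parenthesis or the alias produces a SQL syntax error on the Denodo side rather than a Spark error, which can be confusing to debug.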
 

Depending on the Databricks version, you might get a "No suitable driver" error when following these steps. In such scenarios, the PostgreSQL driver, which is loaded by default, can be used as a workaround.

Class.forName("org.postgresql.Driver")

val jdbcHostname = "*****"
val jdbcPort = "*****"
val jdbcDatabase = "*****"

// Create the JDBC URL without passing in the user and password parameters.
val jdbcUrl = s"jdbc:postgresql://${jdbcHostname}:${jdbcPort}/${jdbcDatabase}"

// Create a Properties() object to hold the parameters.
import java.util.Properties
val connectionProperties = new Properties()

connectionProperties.put("user", "admin")
connectionProperties.put("password", "mypass")

val pushdown_query = "(select * from oracle_customer) oracle_customer"

val df = spark.read.jdbc(url = jdbcUrl, table = pushdown_query, properties = connectionProperties)
Console.println(pushdown_query)
display(df)
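When diagnosing the "No suitable driver" error, it can help to see which JDBC drivers are actually registered on the cluster's JVM. A quick diagnostic sketch (plain `java.sql` API, nothing Denodo-specific):

```scala
import java.sql.DriverManager
import scala.collection.mutable.ListBuffer

// Collect the class names of every JDBC driver registered with DriverManager;
// the Denodo (or PostgreSQL) driver should appear here if its jar was loaded.
val registered = ListBuffer[String]()
val drivers = DriverManager.getDrivers
while (drivers.hasMoreElements) registered += drivers.nextElement().getClass.getName
registered.foreach(println)
```

If `com.denodo.vdp.jdbc.Driver` is missing from the list, the jar was not attached to the cluster or the `Class.forName` call did not run on the driver node.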

Source


5 REPLIES

Anonymous
Not applicable

Hello, @sathyanarayan kokku. My name is Piper, and I'm a moderator for Databricks. Welcome to the community and thank you for bringing us this challenge. We will give your peers a chance to respond and then we will circle back to you if necessary.

Thanks in advance for your patience!

Kaniz
Community Manager

Hi @sathyanarayan kokku, to establish a successful connection to Azure Databricks from Virtual DataPort, use the default Simba JDBC driver provided by Denodo (choose Database Adapter: Spark SQL 2.x Databricks). You can refer to the knowledge base article How to connect to Azure Databricks from Denodo for detailed steps.

When providing authentication details, you can put the login information either in the JDBC URL or in the Login/Password parameters of the JDBC data source; both options work. For deployments, use Solution Manager: it lets you deploy VQL elements from one environment to another using an exported properties file that captures the connection parameters.
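For reference, the URL that the Spark SQL 2.x (Databricks) adapter connects with follows the Simba Spark JDBC format. A sketch of how such a URL is assembled (the hostname and HTTP path below are hypothetical placeholders, not real endpoints, and the exact prefix may differ by driver version):

```scala
// Hypothetical placeholders for an Azure Databricks JDBC endpoint.
val host = "adb-1234567890123456.7.azuredatabricks.net"
val httpPath = "sql/protocolv1/o/1234567890123456/0123-456789-abcdef12"

// Simba Spark JDBC URL; AuthMech=3 selects username/password (token) auth,
// so the credentials can be supplied via the Login/Password parameters.
val databricksJdbcUrl =
  s"jdbc:spark://$host:443/default;transportMode=http;ssl=1;httpPath=$httpPath;AuthMech=3"

println(databricksJdbcUrl)
```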

Thanks for the reply. What I'm looking for is to connect to Denodo from the Databricks workspace.

The details above are for connecting to Databricks from Denodo, which is the reverse.

Please help me with this.


User16753724663
Valued Contributor

Hi @sathyanarayan kokku,

Are you trying to install the Denodo VDP server in Databricks?
