02-04-2022 09:04 AM
I would like to connect to Denodo VDP from a Databricks workspace.
I have installed the ODBC client and installed the Denodo JAR in the cluster, but I am not able to understand the remaining steps.
Could you please help me?
03-29-2022 05:07 AM
Hi @sathyanarayan kokku,
Once you've installed the Denodo JAR in the cluster, create a connection and execute a query in a notebook following these steps (more info can be found at SQL Databases using JDBC):
Class.forName("com.denodo.vdp.jdbc.Driver")
val jdbcHostname = "*******" // Denodo VDP server host
val jdbcPort = "*****" // Denodo VDP JDBC port
val jdbcDatabase = "*****" // Denodo virtual database name
// Create the JDBC URL without passing in the user and password parameters.
val jdbcUrl = s"jdbc:vdb://${jdbcHostname}:${jdbcPort}/${jdbcDatabase}"
// Create a Properties() object to hold the parameters.
import java.util.Properties
val connectionProperties = new Properties()
connectionProperties.put("user", "admin")
connectionProperties.put("password", "mypass")
// Wrap the query in parentheses and alias it so it can be used as a table expression.
val pushdown_query = "(select * from oracle_customers) oracle_customers"
val df = spark.read.jdbc(url = jdbcUrl, table = pushdown_query, properties = connectionProperties)
display(df)
Depending on the Databricks version, you might get a "No suitable driver" error when following these steps. In such scenarios, the PostgreSQL driver, which is loaded by default, can be used as a workaround:
Class.forName("org.postgresql.Driver")
val jdbcHostname = "*****" // Denodo VDP server host
val jdbcPort = "****" // Denodo VDP JDBC port
val jdbcDatabase = "*****" // Denodo virtual database name
// Create the JDBC URL without passing in the user and password parameters.
val jdbcUrl = s"jdbc:postgresql://${jdbcHostname}:${jdbcPort}/${jdbcDatabase}"
// Create a Properties() object to hold the parameters.
import java.util.Properties
val connectionProperties = new Properties()
connectionProperties.put("user", "admin")
connectionProperties.put("password", "mypass")
// Wrap the query in parentheses and alias it so it can be used as a table expression.
val pushdown_query = "(select * from oracle_customer) oracle_customer"
val df = spark.read.jdbc(url = jdbcUrl, table = pushdown_query, properties = connectionProperties)
Console.println(pushdown_query)
display(df)
02-04-2022 11:05 AM
Hello, @sathyanarayan kokku. My name is Piper, and I'm a moderator for Databricks. Welcome to the community and thank you for bringing us this challenge. We will give your peers a chance to respond and then we will circle back to you if necessary.
Thanks in advance for your patience!
02-04-2022 12:05 PM
Hi @sathyanarayan kokku, to establish a connection to Azure Databricks from Virtual DataPort, use the default Simba JDBC driver provided by Denodo (choose Database Adapter: Spark SQL 2.x Databricks). You can refer to the Knowledge Base article How to connect to Azure Databricks from Denodo for detailed steps.
When providing authentication details, you can supply the login information either in the JDBC URL or in the Login/Password parameters of the JDBC data source; both options work. For deployment, use the Solution Manager: it lets you deploy VQL elements from one environment to another using an exported properties file, which can capture the connection parameters.
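As a rough illustration of what that data source configuration looks like, a Simba-based JDBC URL pointing Virtual DataPort at a Databricks cluster generally follows the pattern below. Every value here is a placeholder, not taken from this thread; the real hostname, HTTP path, and port come from the cluster's JDBC/ODBC connection details, and AuthMech=3 with user "token" is the usual way to authenticate with a personal access token:
jdbc:spark://<workspace-host>:443/default;transportMode=http;ssl=1;httpPath=<http-path-from-cluster>;AuthMech=3;UID=token;PWD=<personal-access-token>
If the login is instead supplied through the data source's Login/Password fields, the UID and PWD segments can be omitted from the URL.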
02-06-2022 08:26 PM
Thanks for the reply. What I am looking for is how to connect to Denodo from the Databricks workspace.
The details above are for connecting to Databricks from Denodo, which is the reverse direction.
Please help me with the same.
03-07-2022 08:23 PM
Hi @sathyanarayan kokku,
Are you trying to install the Denodo VDP server in Databricks?