02-04-2022 09:04 AM
I would like to connect to Denodo VDP from a Databricks workspace.
I have installed the ODBC client and the Denodo JAR in the cluster, but I am not able to understand the remaining steps.
Could you please help me?
03-29-2022 05:07 AM
Hi @sathyanarayan kokku ,
Once you've installed the Denodo JAR in the cluster, create a connection and execute a query in a Notebook following these steps (more info can be found at SQL Databases using JDBC):
Class.forName("com.denodo.vdp.jdbc.Driver")
val jdbcHostname = "*******"
val jdbcPort = *****
val jdbcDatabase = "*****"
// Create the JDBC URL without passing in the user and password parameters.
val jdbcUrl = s"jdbc:vdb://${jdbcHostname}:${jdbcPort}/${jdbcDatabase}"
// Create a Properties() object to hold the parameters.
import java.util.Properties
val connectionProperties = new Properties()
connectionProperties.put("user", "admin")
connectionProperties.put("password", "mypass")
// Wrap the pushdown query in parentheses and give it an alias so it can be used as a table.
val pushdown_query = "(select * from oracle_customers) oracle_customers"
val df = spark.read.jdbc(url = jdbcUrl, table = pushdown_query, properties = connectionProperties)
display(df)
Depending on the Databricks version, you might get a “No suitable driver” error when following these steps. In that case, the PostgreSQL driver, which is loaded by default, can be used as a workaround.
Class.forName("org.postgresql.Driver")
val jdbcHostname = "*****"
val jdbcPort = ****
val jdbcDatabase = "*****"
// Create the JDBC URL without passing in the user and password parameters.
val jdbcUrl = s"jdbc:postgresql://${jdbcHostname}:${jdbcPort}/${jdbcDatabase}"
// Create a Properties() object to hold the parameters.
import java.util.Properties
val connectionProperties = new Properties()
connectionProperties.put("user", "admin")
connectionProperties.put("password", "mypass")
val pushdown_query = "(select * from oracle_customer) oracle_customer"
val df = spark.read.jdbc(url = jdbcUrl, table = pushdown_query, properties = connectionProperties)
Console.println(pushdown_query)
display(df)
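Another way to work around driver-resolution issues is to pass the driver class explicitly through the DataFrameReader options instead of relying on Class.forName. This is a minimal sketch of the same Denodo read; the hostname, port, database, credentials, and table name are placeholders, not real values:

```scala
// Sketch: the same Denodo read expressed with Spark's options-based JDBC API,
// passing the driver class name explicitly so Spark does not have to discover it.
// All connection values below are placeholders.
val df = spark.read
  .format("jdbc")
  .option("driver", "com.denodo.vdp.jdbc.Driver") // explicit driver class
  .option("url", s"jdbc:vdb://${jdbcHostname}:${jdbcPort}/${jdbcDatabase}")
  .option("dbtable", "(select * from oracle_customers) oracle_customers")
  .option("user", "admin")
  .option("password", "mypass")
  .load()
display(df)
```

This requires a running cluster with the Denodo JAR attached and a reachable VDP server, so it is only a sketch of the call shape, not something you can run standalone.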
02-04-2022 11:05 AM
Hello, @sathyanarayan kokku. My name is Piper, and I'm a moderator for Databricks. Welcome to the community and thank you for bringing us this challenge. We will give your peers a chance to respond and then we will circle back to you if necessary.
Thanks in advance for your patience!
02-04-2022 12:05 PM
Hi @sathyanarayan kokku , to establish a connection to Azure Databricks from Virtual DataPort, use the default Simba JDBC driver provided by Denodo (choose Database Adapter: Spark SQL 2.x Databricks). You can refer to the Knowledge Base article How to connect to Azure Databricks from Denodo for detailed steps.
When providing authentication details, you can supply the login information either in the JDBC URL or in the Login/password parameters of the JDBC data source; both options work. For deployment, use Solution Manager: it lets you deploy VQL elements from one environment to another using an exported properties file, which captures the connection parameters.
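For reference, a JDBC URL for the Simba Spark driver against Azure Databricks typically follows this shape; the server hostname, HTTP path, and personal access token are placeholders you would take from your cluster's JDBC/ODBC connection details:

```
jdbc:spark://<server-hostname>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>
```

With AuthMech=3, the user is the literal string token and the password is a Databricks personal access token.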
02-06-2022 08:26 PM
Thanks for the reply. What I am looking for is the reverse: connecting to Denodo from the Databricks workspace.
The details above are for connecting to Databricks from Denodo.
Please help me with the same.
03-07-2022 08:23 PM
Hi @sathyanarayan kokku
Are you trying to install the Denodo VDP server in Databricks?