Data Engineering

Forum Posts

SebastianM
by New Contributor
  • 957 Views
  • 1 reply
  • 0 kudos

JDBC to delta lake: Is setting fetch size expected to be effective?

I am using the databricks jdbc driver to access a delta lake. The database URL specifies transportMode=http. I have experimented with setting different values of fetchSize on the java.sqlPreparedStatement object and have monitored memory use within m...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

I think there is one Spark configuration for this, but I can't recall it right now. Please try looking through this doc; you may find something useful: https://spark.apache.org/docs/latest/configuration.html

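For readers landing on this thread: a minimal sketch of the pattern being discussed, setting fetchSize on a PreparedStatement through the Databricks JDBC driver. Every connection value and the table name are placeholders, and whether the driver honours the hint can depend on the driver version and transport mode, so treat this as illustrative rather than definitive.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class FetchSizeExample {
    public static void main(String[] args) throws Exception {
        // All connection values are placeholders; fill in your own workspace details.
        String url = "jdbc:databricks://<workspace-host>:443/default;transportMode=http;"
                + "ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>";
        try (Connection conn = DriverManager.getConnection(url);
                PreparedStatement stmt = conn.prepareStatement("SELECT * FROM <catalog>.<schema>.<table>")) {
            // Ask the driver to stream results in batches of 10,000 rows.
            // Whether this actually bounds client memory depends on the driver version and transport.
            stmt.setFetchSize(10_000);
            try (ResultSet rs = stmt.executeQuery()) {
                long rows = 0;
                while (rs.next()) {
                    rows++; // iterate without holding the whole result set ourselves
                }
                System.out.println("Rows read: " + rows);
            }
        }
    }
}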
elgeo
by Valued Contributor II
  • 7039 Views
  • 7 replies
  • 4 kudos

Resolved! Invalid JDBC url

Hello. I am trying to establish a connection between DBeaver and Databricks. I followed the steps in DBeaver integration with Databricks | Databricks on AWS, but I get the following error while testing the connection: Could anyone provide any insight...

[attachment: jdbc_url_error]
Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @ELENI GEORGOUSI​, glad to hear it! We'd just request that you mark an answer as best. Thanks...

6 More Replies
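For anyone hitting the same "Invalid JDBC url" message, the usual culprits are the URL shape or the driver selected in DBeaver. A minimal sketch of a connection using the URL format that appears elsewhere on this board; the host, org/cluster IDs, and token are placeholders, and the same values go into DBeaver's driver and connection settings.

import java.sql.Connection;
import java.sql.DriverManager;

public class UrlFormatCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder values throughout; AuthMech=3 means user "token" plus a personal access token.
        String url = "jdbc:databricks://dbc-XXXXXXXX-XXXX.cloud.databricks.com:443/default;"
                + "transportMode=http;ssl=1;"
                + "httpPath=sql/protocolv1/o/<org-id>/<cluster-id>;"
                + "AuthMech=3;UID=token;PWD=<personal-access-token>";
        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}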
mattmunz
by New Contributor III
  • 1987 Views
  • 1 reply
  • 4 kudos

JDBC Error: Error occured while deserializing arrow data

I am getting the following error in my Java application: java.sql.SQLException: [Databricks][DatabricksJDBCDriver](500618) Error occured while deserializing arrow data: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available. I beli...

Latest Reply
User16753725469
Contributor II
  • 4 kudos

Please try adding the --add-opens flag to the java command line in your JVM call:
% javac -classpath SparkJDBC42Example.jar:. jdbc_example.java
% java --add-opens=java.base/java.nio=ALL-UNNAMED -classpath SparkJDBC42Example.jar...

Reda
by New Contributor II
  • 1235 Views
  • 1 reply
  • 6 kudos

Creating a DLT pipeline that reads from a JDBC source

Hey, I'm trying to create a DLT pipeline that reads from a JDBC source, and the code I'm using looks something like this in Python:
import dlt

@dlt.table
def table_name():
    driver = 'oracle.jdbc.driver.OracleDriver'
    url = '...'
    query = 'SELECT ......

Latest Reply
Anonymous
Not applicable
  • 6 kudos

Hi @Reda Bitar​ Great to meet you, and thanks for your question! Let's see if your peers in the community have an answer to your question first. Or else bricksters will get back to you soon. Thanks

Trey
by New Contributor III
  • 1328 Views
  • 2 replies
  • 3 kudos

Where do you usually store and manage "JDBC credentials" to use on databricks notebook?

Hi all, I would like to improve the way I use JDBC credential information (ID/PW, host, port, etc.). Where do you usually store and use the JDBC credentials? Thanks for your help in advance!

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Kwangwon Yi​, hope all is well! Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks...

1 More Replies
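On the storage question above: inside a notebook the usual recommendation is a Databricks secret scope rather than hard-coded values; from a standalone JVM application, one hedged option is to inject host, HTTP path, and token through environment variables populated by whatever secret store you already use. The variable names below are hypothetical, chosen for illustration only.

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class CredentialsFromEnvironment {
    public static void main(String[] args) throws Exception {
        // Hypothetical variable names; set them from your secret manager or CI system
        // so no token ever lands in source control or a notebook cell.
        String host = System.getenv("DBX_HOST");          // e.g. dbc-....cloud.databricks.com
        String httpPath = System.getenv("DBX_HTTP_PATH"); // cluster or SQL warehouse HTTP path
        String token = System.getenv("DBX_TOKEN");        // personal access token

        String url = "jdbc:databricks://" + host + ":443/default;"
                + "transportMode=http;ssl=1;httpPath=" + httpPath + ";AuthMech=3";

        Properties props = new Properties();
        props.put("UID", "token"); // with AuthMech=3 the user is typically the literal string "token"
        props.put("PWD", token);   // the secret itself, never hard-coded

        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected without credentials in source: " + !conn.isClosed());
        }
    }
}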
Cano
by New Contributor III
  • 766 Views
  • 1 reply
  • 2 kudos

How to add notebook to my Databricks jdbc url?

Please, how do I add a notebook to the JDBC URL in order to run queries externally? jdbc:databricks://dbc-a1b2345c-d6e7.cloud.databricks.com:443/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/1234567890123456/1234-567890-reef123;AuthMech=3;...

Latest Reply
ranged_coop
Valued Contributor II
  • 2 kudos

Not sure if it is possible. Alternatively, you could try adding your notebook to a job and then triggering that job via the Jobs API. Please refer to the link below: Jobs API 2.1 | Databricks on AWS

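Following up on the Jobs API suggestion in the reply above: the notebook is attached to a job as a notebook task, and the job is then triggered externally with POST /api/2.1/jobs/run-now. A minimal sketch using Java 11's built-in HTTP client; the workspace URL, token variable, and job ID are placeholders.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TriggerNotebookJob {
    public static void main(String[] args) throws Exception {
        String workspace = "https://dbc-XXXXXXXX-XXXX.cloud.databricks.com"; // placeholder workspace URL
        String token = System.getenv("DBX_TOKEN");                           // personal access token
        long jobId = 123L;                                                   // placeholder job id

        // Trigger the job that wraps the notebook.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(workspace + "/api/2.1/jobs/run-now"))
                .header("Authorization", "Bearer " + token)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"job_id\": " + jobId + "}"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // A successful call returns a run_id that can then be polled via /api/2.1/jobs/runs/get.
        System.out.println(response.statusCode() + " " + response.body());
    }
}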
isaac_gritz
by Valued Contributor II
  • 748 Views
  • 1 reply
  • 3 kudos

Connecting Applications and BI Tools to Databricks SQL

Access Data in Databricks Using an Application or your Favorite BI Tool: You can leverage Partner Connect for easy, low-configuration connections to some of the most popular BI tools through our optimized connectors. Alternatively, you can follow these...

Latest Reply
Kaniz
Community Manager
  • 3 kudos

Thank you, @Isaac Gritz​ , for sharing this fantastic post!

sage5616
by Valued Contributor
  • 5044 Views
  • 5 replies
  • 5 kudos

Resolved! SQL Error when querying any tables/views on a Databricks cluster via Dbeaver.

I am able to connect to the cluster, browse its hive catalog, see tables/views and columns/datatypes. Running a simple select statement from a view on a parquet file produces this error and no other results: "SQL Error [500540] [HY000]: [Databricks][Dat...

Latest Reply
sage5616
Valued Contributor
  • 5 kudos

Update. I have tried SQL Workbench/J and encountered exactly the same error(s) as with Dbeaver. I have also tried JetBrains DataGrip and it worked flawlessly. Able to connect, browse the databases and query tables/views. https://docs.microsoft.com/en...

4 More Replies
dataAllMyLife
by New Contributor
  • 682 Views
  • 1 reply
  • 0 kudos

JDBC Connection closes between stmt.execute( ... ) and stmt.executeQuery( ... )

I'm running a Java application that registers a CSV table with Hive and then checks the number of rows imported. It's done in several steps:
Statement stmt = con.createStatement();
....
stmt.execute( "CREATE TABLE ( <definition> < > );
.....
ResultSet rs...

Latest Reply
Noopur_Nigam
Valued Contributor II
  • 0 kudos

@Reto Matter​ Are you running a JAR job or using dbconnect to run the Java code? Please share how you are trying to make the connection, along with the full exception stack trace.

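For readers debugging the same symptom: a minimal sketch of the DDL-then-count sequence the post describes, written with try-with-resources so the single connection is only closed when the block exits. The connection string and table definition are placeholders; if the connection still drops between statements, session/cluster timeouts and the network path are the next things to check.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class CreateThenCount {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string; same format as elsewhere on this board.
        String url = "jdbc:databricks://<workspace-host>:443/default;transportMode=http;"
                + "ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>";

        // One connection and one statement reused for both steps.
        try (Connection con = DriverManager.getConnection(url);
                Statement stmt = con.createStatement()) {
            stmt.execute("CREATE TABLE IF NOT EXISTS my_csv_table (id INT, name STRING)"); // placeholder DDL
            try (ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM my_csv_table")) {
                if (rs.next()) {
                    System.out.println("Rows imported: " + rs.getLong(1));
                }
            }
        }
    }
}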
Ashley1
by Contributor
  • 1763 Views
  • 2 replies
  • 0 kudos

Resolved! JDBC Connectivity via workspace url when No Public IP selected.

Hi All, I think I might be missing something in regard to No Public IP clusters. I have set this option on a workspace (Azure) and set up the appropriate subnets. To my surprise, when I went to set up a JDBC connection to the cluster, the JDBC connec...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hey there @Ashley Betts​, hope you are well. Just wanted to see if you were able to find an answer to your question, and would you like to mark an answer as best? It would be really helpful for the other members too. Cheers!

1 More Replies
mortenhaga
by Contributor
  • 1404 Views
  • 2 replies
  • 2 kudos

Resolved! Importing python function with spark.read.jdbc in to Repos

Hi all! Before we used Databricks Repos, we used the %run magic to run various utility Python functions from one notebook inside other notebooks, for example reading from a JDBC connection. We now plan to switch to Repos to utilize the fantastic CI/CD pos...

[attachment: image.png]
Latest Reply
mortenhaga
Contributor
  • 2 kudos

That's... odd. I was sure I had tried that, but now it works somehow. I guess it has to be that now I did it with double quotation marks. Thanks anyway! Works like a charm.

1 More Replies
findinpath
by Contributor
  • 3683 Views
  • 4 replies
  • 4 kudos

Resolved! Databricks 2.6.25 JDBC driver can't create tables with `GENERATED` columns

I'm using the Databricks JDBC driver recently made available via Maven: https://mvnrepository.com/artifact/com.databricks/databricks-jdbc/2.6.25
While trying to create a table with `GENERATED` columns I receive the following exception: Caused by: java.s...

Latest Reply
findinpath
Contributor
  • 4 kudos

I was under the impression that this has been recognised as a bug and is being handled by Databricks. What do I need to do to report the issue officially as a bug?

3 More Replies
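For context on what the thread is exercising, a sketch of the kind of generated-column DDL that the 2.6.25 driver rejected, sent through a plain JDBC Statement. The connection string, table name, and expression are placeholders; per the thread, whether this succeeds depends on the driver build, so treat it as a reproduction sketch rather than a guarantee.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class GeneratedColumnRepro {
    public static void main(String[] args) throws Exception {
        // Placeholder connection string; same format as elsewhere on this board.
        String url = "jdbc:databricks://<workspace-host>:443/default;transportMode=http;"
                + "ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>";

        try (Connection conn = DriverManager.getConnection(url);
                Statement stmt = conn.createStatement()) {
            // Delta generated column: event_date is always derived from event_time.
            stmt.execute(
                "CREATE TABLE IF NOT EXISTS events ("
                + " event_time TIMESTAMP,"
                + " event_date DATE GENERATED ALWAYS AS (CAST(event_time AS DATE))"
                + ") USING DELTA");
            System.out.println("DDL submitted");
        }
    }
}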
sriramkumar
by New Contributor II
  • 1921 Views
  • 3 replies
  • 0 kudos

New Databricks Driver gives SQLNonTransientConnectionException when trying to connect to Databricks Instance

import com.databricks.client.jdbc.DataSource;
import java.sql.*;

public class testDatabricks {
    public static void main(String[] args) throws SQLException {
        String dbUrl = "jdbc:databricks://<hostname>:443;HttpPath=<HttpPath>;"; // Cop...

Latest Reply
Atanu
Esteemed Contributor
  • 0 kudos

This looks like it is due to maintenance in the US regions. Are you still facing the issue, @Sriramkumar Thamizharasan​? Is your workspace in eastus or eastus2?

2 More Replies
knight007
by New Contributor II
  • 1879 Views
  • 8 replies
  • 5 kudos

Containerized Databricks/Spark database

Hello. I'm fairly new to Databricks and Spark. I have a requirement to connect to Databricks using JDBC and that works perfectly using the driver I downloaded from the Databricks website ("com.simba.spark.jdbc.Driver"). What I would like to do now is ha...

Latest Reply
Kaniz
Community Manager
  • 5 kudos

Hi @Gurps Bassi​, just a friendly follow-up. Do you still need help, or did @Hubert Dudek​'s and @Werner Stinckens​'s responses help you find the solution? Please let us know.

7 More Replies
tz1
by New Contributor III
  • 11806 Views
  • 13 replies
  • 7 kudos

Resolved! Problem with Databricks JDBC connection: Error occured while deserializing arrow data

I have a Java program like this to test out the Databricks JDBC connection with the Databricks JDBC driver.
Connection connection = null;
try {
    Class.forName(driver);
    connection = DriverManager.getConnection(url...

Latest Reply
Alice__Caterpil
New Contributor III
  • 7 kudos

Hi @Jose Gonzalez​, this similar issue with the Snowflake JDBC driver is a good reference. I was able to get this to work on Java OpenJDK 17 by specifying this JVM option: --add-opens=java.base/java.nio=ALL-UNNAMED. Although I came across another issue with...

12 More Replies