Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
I am working on writing a large amount of data from Databricks to an external SQL Server using a JDBC connection. I keep getting timeout/connection-lost errors, but digging deeper it appears to be a memory problem. I am wondering what cluster configura...
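For this kind of bulk JDBC write, the write configuration often matters as much as the cluster size. A minimal sketch, assuming a SQL Server JDBC URL with placeholder host, table, and credentials; the partition count and batchsize are starting points to tune, not prescriptions:

# Partitioned JDBC write to SQL Server; host, database, table, and credentials are placeholders.
jdbc_url = "jdbc:sqlserver://myserver.example.com:1433;databaseName=mydb"

(df.repartition(32)                        # more, smaller partitions -> less memory per task
   .write
   .format("jdbc")
   .option("url", jdbc_url)
   .option("dbtable", "dbo.target_table")
   .option("user", "sql_user")
   .option("password", "sql_password")
   .option("batchsize", 10000)             # rows per JDBC batch insert
   .mode("append")
   .save())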
I use the code below to connect to PostgreSQL (spark.read.jdbc() already returns a DataFrame, so no .load() call is chained on).
df = spark.read \
    .jdbc("jdbc:postgresql://hostname:5432/dbname", "schema.table",
          properties={"user": "user", "password": "password"})
df.printSchema()
However, I got the ...
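One thing worth checking when a read like this fails is whether the PostgreSQL driver class is being resolved. A sketch of the same read using the option-based API with an explicit driver; hostname, database, table, and credentials are placeholders:

# Equivalent read via DataFrameReader options, with the driver class stated explicitly.
df = (spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://hostname:5432/dbname")
      .option("dbtable", "schema.table")
      .option("user", "user")
      .option("password", "password")
      .option("driver", "org.postgresql.Driver")
      .load())
df.printSchema()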
What are the steps needed to connect to a DB2 AS/400 source to pull data into the lake using Databricks? I believe it requires establishing a JDBC connection, but I could not find much detail online.
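One common route is the open-source jt400 (IBM Toolbox for Java) JDBC driver. A minimal sketch, assuming the jt400 jar is installed on the cluster; host, library, table, credentials, and the target path are placeholders:

# Read from DB2 on IBM i (AS/400) over JDBC with the jt400 driver, then land it in the lake.
df = (spark.read
      .format("jdbc")
      .option("url", "jdbc:as400://as400-host")
      .option("driver", "com.ibm.as400.access.AS400JDBCDriver")
      .option("dbtable", "MYLIB.MYTABLE")
      .option("user", "as400_user")
      .option("password", "as400_password")
      .load())

df.write.format("delta").mode("overwrite").save("/mnt/lake/raw/mytable")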
Hi @Ajay Menon, hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
I am using the Databricks JDBC driver (https://databricks.com/spark/jdbc-drivers-download) to connect to Azure Databricks. The connection needs to be routed through an HTTP proxy. I found parameters that can be configured for using the HTTP proxy: By pa...
Hi @Jonas Minning, I am actually having the same issue, and when I looked into the driver documentation I found that the driver currently only supports SOCKS proxies, which I believe is why we are getting this error. So, I ...
Hi everyone! I have a question. For a project I need to establish a JDBC connection using spark.read. My question is: when is the connection closed? I ask because I will read multiple tables from that database, so if I could just create a conn...
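As far as I know, Spark opens JDBC connections per read (and per partition task) and closes them when that read finishes, so there is no long-lived connection object to hold on to; what you can reuse is the configuration. A sketch with placeholder URL, credentials, and table names:

# Share one URL/properties definition across several table reads; Spark manages
# the underlying connections itself for each read.
jdbc_url = "jdbc:postgresql://hostname:5432/dbname"
props = {"user": "user", "password": "password", "driver": "org.postgresql.Driver"}

tables = ["schema.customers", "schema.orders", "schema.invoices"]
dfs = {t: spark.read.jdbc(jdbc_url, t, properties=props) for t in tables}

dfs["schema.orders"].printSchema()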
Hello. I am trying to establish a connection between DBeaver and Databricks. I followed the steps in DBeaver integration with Databricks | Databricks on AWS, but I get the following error while testing the connection: Could anyone provide any insight...
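For comparison, a Databricks JDBC URL for a client like DBeaver typically looks roughly like the line below; the server hostname, HTTP path, and personal access token are placeholders, and older Simba driver versions use the jdbc:spark:// prefix instead of jdbc:databricks://:

jdbc:databricks://<server-hostname>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>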
Hi all, I would like to improve the way I handle JDBC credential information (ID/PW, host, port, etc.). Where do you usually store and use the JDBC credentials? Thanks for your help in advance!
Hi @Kwangwon Yi, hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Thanks!
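A common answer to the credentials question above is a Databricks secret scope read at runtime. A minimal sketch; the scope name and key names are placeholders you would create yourself:

# Pull JDBC credentials from a secret scope instead of hard-coding them in the notebook.
user = dbutils.secrets.get(scope="jdbc-creds", key="pg-user")
password = dbutils.secrets.get(scope="jdbc-creds", key="pg-password")
host = dbutils.secrets.get(scope="jdbc-creds", key="pg-host")

df = (spark.read
      .format("jdbc")
      .option("url", f"jdbc:postgresql://{host}:5432/dbname")
      .option("dbtable", "schema.table")
      .option("user", user)
      .option("password", password)
      .load())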
I'm running a Java application that registers a CSV table with Hive and then checks the number of rows imported. It's done in several steps: Statement stmt = con.createStatement(); ... stmt.execute("CREATE TABLE ( <definition> < > )"); ... ResultSet rs...
@Reto Matter Are you running a JAR job or using dbconnect (Databricks Connect) to run the Java code? Please share how you are trying to make the connection and the full exception stack trace.
Hi all! Before we used Databricks Repos, we used the %run magic to run various utility Python functions from one notebook inside other notebooks, for example for reading from JDBC connections. We now plan to switch to Repos to utilize the fantastic CI/CD pos...
That's... odd. I was sure I had tried that, but now it works somehow. I guess it has to be that I did it with double quotation marks this time. Thanks anyway! Works like a charm.
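For reference, the pattern described above is the %run magic with a quoted, repo-relative path; the folder and notebook names here are placeholders:

%run "./utils/jdbc_connections"

Relative paths resolve against the calling notebook's location, so the same call should keep working once the utility notebooks live inside the repo.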
Hello all, I'm trying to pull table data from Databricks tables that contain foreign-language characters in UTF-8 into an ETL tool using a JDBC connection. I'm using the latest Simba Spark JDBC driver available from the Databricks website. The issue i...
Can you try setting UseUnicodeSqlCharacterTypes=1 in the driver, and also make sure 'file.encoding' is set to UTF-8 in the JVM, and see if the issue still persists?
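Concretely, that means appending the property to the JDBC URL you already use and setting the encoding on the ETL tool's JVM, roughly as below (the elided part of the URL is whatever your connection string already contains):

;UseUnicodeSqlCharacterTypes=1      appended to the existing JDBC connection URL
-Dfile.encoding=UTF-8               added to the ETL tool's JVM options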
What's the best way to federate a query to Delta Lake or Databricks from Presto SQL without having to create external tables? PrestoSQL doesn't have access to S3. Can PrestoSQL be configured with a JDBC driver or plugin?