Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Forum Posts

db-avengers2rul
by Contributor II
  • 1313 Views
  • 1 replies
  • 0 kudos

Error connecting to PostgreSQL from Databricks Community Edition

Dear Team, I am trying to establish connectivity to PostgreSQL from Databricks Community Edition using a SQL notebook; however, I am encountering the below error: Error in SQL statement: IllegalArgumentException: requirement failed: Host name should not cont...

Latest Reply
db-avengers2rul
Contributor II
  • 0 kudos

@Team any suggestions?

akj2784
by New Contributor II
  • 20583 Views
  • 11 replies
  • 1 kudos

How to connect PostgreSQL from Databricks

I am trying to connect to PostgreSQL from Azure Databricks. I am using the below code to connect. jdbcHostname = "Test" jdbcPort = 1234 jdbcDatabase = "Test1" jdbcUrl = "jdbc:postgresql://{0}:{1}/{2}".format(jdbcHostname, jdbcPort, jdbcDatabase) Conn...
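The URL construction in the excerpt can be sketched out fully as follows. This is a minimal sketch, not the poster's exact notebook: the placeholder values ("Test", 1234, "Test1") are kept from the question, and the credentials and table name are made up for illustration.

```python
# Placeholder values kept from the post; the hostname must be bare
# (no "jdbc:" scheme, no slashes) -- the driver assembles the full URL.
jdbcHostname = "Test"
jdbcPort = 1234
jdbcDatabase = "Test1"

jdbcUrl = "jdbc:postgresql://{0}:{1}/{2}".format(jdbcHostname, jdbcPort, jdbcDatabase)

# Credentials belong in a separate properties dict, not in the URL.
# These names/values are hypothetical.
connectionProperties = {
    "user": "postgres_user",
    "password": "postgres_password",
    "driver": "org.postgresql.Driver",
}

# In a Databricks notebook you would then read with (not run here):
# df = spark.read.jdbc(url=jdbcUrl, table="my_table", properties=connectionProperties)
```

A malformed hostname here (for example one that already includes `jdbc:postgresql://`) produces exactly the kind of "Host name should not contain ..." IllegalArgumentException reported in the other thread.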

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Javier De La Torre do you really need two-way SSL (verify-full)? In most cases one-way SSL (sslmode=require) should be enough. @akj2784 When you say "Connection was successful", where do you mean you established a successful connection? You might...

10 More Replies
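The sslmode distinction raised in the reply above can be sketched as URL parameters on a PostgreSQL JDBC URL. This is an illustrative sketch only; "db.example.com", "mydb", and the cert path are made-up values.

```python
# Hypothetical base URL for illustration.
base_url = "jdbc:postgresql://db.example.com:5432/mydb"

# One-way SSL: the connection is encrypted, but the client does not
# verify the server's certificate.
one_way = base_url + "?sslmode=require"

# Full verification: the client also validates the server certificate
# chain and hostname against a trusted root certificate.
verify_full = base_url + "?sslmode=verify-full&sslrootcert=/path/to/root.crt"
```

For most driver-to-database connections inside a trusted network, `sslmode=require` is sufficient, which is the point the reply is making.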
longcao
by New Contributor III
  • 14994 Views
  • 5 replies
  • 0 kudos

Resolved! Writing DataFrame to PostgreSQL via JDBC extremely slow (Spark 1.6.1)

Hi there, I'm just getting started with Spark and I've got a moderately sized DataFrame created from collating CSVs in S3 (88 columns, 860k rows) that seems to be taking an unreasonable amount of time to insert (using SaveMode.Append) into Postgres. I...

Latest Reply
longcao
New Contributor III
  • 0 kudos

In case anyone was curious how I worked around this, I ended up dropping down to Postgres JDBC and using CopyManager to COPY rows in directly from Spark: https://gist.github.com/longcao/bb61f1798ccbbfa4a0d7b76e49982f84

4 More Replies
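The core idea of the accepted workaround above is to bypass row-by-row JDBC INSERTs and stream rows through PostgreSQL's COPY protocol instead. The linked gist does this in Scala with the driver's CopyManager; below is a hedged Python sketch of just the data-preparation half, using only the standard library. The sample rows and table name are made up; the actual COPY call is shown as a comment because it needs a live connection.

```python
import csv
import io

# Made-up sample rows standing in for a partition of the DataFrame.
rows = [
    (1, "alice", 3.5),
    (2, "bob", 2.75),
]

# Serialize the rows into an in-memory CSV buffer, which is the
# stream format COPY ... FROM STDIN consumes.
buf = io.StringIO()
writer = csv.writer(buf)
for row in rows:
    writer.writerow(row)
buf.seek(0)

# With the Postgres JDBC driver's CopyManager (as in the gist) or
# psycopg2's copy_expert, you would then feed the buffer to something like:
#   COPY my_table FROM STDIN WITH (FORMAT csv)
```

COPY amortizes per-statement overhead across the whole stream, which is why it is dramatically faster than appending 860k rows one INSERT at a time.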