SSL connection java.sql.SQLException with Redshift
05-09-2015 02:35 PM
05-09-2015 02:38 PM
If you see the following error with SSL disabled:
java.sql.SQLException: [Amazon](500150) Error setting/closing connection: no pg_hba.conf entry for host "x.x.x.x", user "user", database "database", SSL off.
and then the following error with SSL enabled:
java.sql.SQLException: [Amazon](500150) Error setting/closing connection: General SSLEngine problem.
you can try appending
sslfactory=org.postgresql.ssl.NonValidatingFactory
to the JDBC connection URL to fix the problem.
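For example, a minimal sketch of what that looks like (the cluster endpoint and database name below are placeholders, not values from this thread):
jdbcUrl = "jdbc:postgresql://example-cluster.us-east-1.redshift.amazonaws.com:5439/database"
# Append the parameter suggested above; use "&" instead of "?" if the URL already has query parameters.
jdbcUrl = jdbcUrl + "?sslfactory=org.postgresql.ssl.NonValidatingFactory"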
08-25-2016 10:04 AM
I am having the same issue. That did not work for me; I am still getting the same error. Did you get your issue resolved?
06-14-2017 08:21 AM
I also have the same issue, and the suggested solution did not work. As a side note, everything had been working until last Friday...
06-14-2017 02:09 PM
sample = (spark.read
    .format("com.databricks.spark.redshift")
    .option("url", jdbcUrl)
    .option("dbtable", "xx.xxx")  # schema.table
    .option("forward_spark_s3_credentials", True)
    .option("tempdir", tempDir)
    .option("autoenablessl", "false")  # disable SSL
    .load())
The above suggested solution didn't work for me, but disabling SSL did. I wish there was a better solution.

