05-09-2015 02:35 PM
05-09-2015 02:38 PM
If you see the following error without SSL:
java.sql.SQLException: [Amazon](500150) Error setting/closing connection: no pg_hba.conf entry for host "x.x.x.x", user "user", database "database", SSL off.
and the following error with SSL enabled:
java.sql.SQLException: [Amazon](500150) Error setting/closing connection: General SSLEngine problem.
You can try appending
sslfactory=org.postgresql.ssl.NonValidatingFactory
to the connection URL to fix the problem.
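For example, the full JDBC URL might look roughly like the sketch below; the cluster endpoint, port, database, and credentials are placeholders, not values from this thread:
jdbcUrl = (
    "jdbc:redshift://examplecluster.abc123xyz.us-west-2.redshift.amazonaws.com:5439/dev"
    "?user=example_user&password=example_password"
    "&ssl=true"
    "&sslfactory=org.postgresql.ssl.NonValidatingFactory"  # skip certificate validation
)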
08-25-2016 10:04 AM
I am having the same issue. That did not work for me; I am still getting the same error. Did you get your issue resolved?
06-14-2017 08:21 AM
I also have the same issue, and the suggested solution did not work. As a side note, everything had been working until last Friday...
06-14-2017 02:09 PM
sample = (spark.read
.format("com.databricks.spark.redshift")
.option("url", jdbcUrl)
.option("dbtable", "xx.xxx") # schema, table
.option("forward_spark_s3_credentials", True)
.option("tempdir", tempDir)
.option("autoenablessl", "false") # disable SSL.
.load())
The above suggested solution didn't work for me, but disabling SSL did. I wish there was a better solution.