The authentication type 10 is not supported

LoiNguyen
New Contributor II

I use the code below to connect to PostgreSQL:

df = spark.read \
    .jdbc("jdbc:postgresql://hostname:5432/dbname", "schema.table",
          properties={"user": "user", "password": "password"})
df.printSchema()

However, I get the following error:

org.postgresql.util.PSQLException: The authentication type 10 is not supported. Check that you have configured the pg_hba.conf file to include the client's IP address or subnet, and that it is using an authentication scheme supported by the driver.

Can anyone help me to solve this problem? Thanks.
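For reference, "authentication type 10" is SCRAM-SHA-256, the default password scheme since PostgreSQL 10; the PostgreSQL JDBC driver only supports it from version 42.2.x onward, so an outdated driver library on the cluster is a common cause. A minimal sketch of the connection options (the host, database, and credentials below are placeholders):

```python
# Connection options for a PostgreSQL JDBC read; hostname, database,
# table, and credentials are placeholders for illustration only.
jdbc_options = {
    "url": "jdbc:postgresql://hostname:5432/dbname",
    "dbtable": "schema.table",
    "user": "user",
    "password": "password",
    # Explicit driver class; SCRAM ("authentication type 10") needs
    # the postgresql JDBC driver at version 42.2.x or newer.
    "driver": "org.postgresql.Driver",
}

# Inside a Spark session, the read would then be:
# df = spark.read.format("jdbc").options(**jdbc_options).load()
# df.printSchema()
```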

5 REPLIES

mlapierre
New Contributor II

Hi, were you able to solve the issue? I just ran into the same problem and haven't found any other discussions on this.

DBXC
Contributor

Try the following to check connectivity between Databricks and your DB server:

%sh ping servername

%sh ping IP_Of_your_server

If there is no connectivity, you will need to check your firewall / vNet and DNS mapping, as Databricks requires a fully qualified DNS name rather than a bare server name with ports (assuming this is how you access your DB server "locally" on your network).
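To complement the ping checks above, a small Python sketch can test whether the PostgreSQL port itself is reachable from the cluster (the host and port below are placeholders; ping succeeding does not guarantee the database port is open):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers DNS failure, connection refused, and timeout.
        return False

# Example (placeholder host): can_connect("hostname", 5432)
```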

HaroldAlvarado
New Contributor II

Hey there! I understand that this type of issue is frustrating for any user. If you are a Mac user and still struggling with the authentication problem, you can check this source: https://setapp.com/how-to/how-to-flush-dns-cache. It is a complete tutorial on how to flush the DNS cache on a Mac. Good luck.

simboss
New Contributor II

But how are we going to do this for those who use Windows?

AntonioR
New Contributor II

In Windows the command is:

ipconfig /flushdns