10-25-2021 01:10 AM
Hi everyone,
I am using SSH tunnelling with SSHTunnelForwarder to reach a target AWS RDS PostgreSQL database. The connection is established, but whenever I try to display the retrieved data frame it throws a "connection refused" error. Please see the screenshot below for clarity. I don't understand how this works; it seems the display does not reuse the connection established by the spark.read command. Can somebody help me with this?
What I already tried:
- SSH into the cluster, open the tunnel, and run psql from there [works]
- Connect directly to the DB instance from the proxy host [works]
- Use pandas to load the data frame instead of Spark [does not work]
- Create a new SSH key pair for Databricks [does not work]
- Change listen_addresses in postgresql.conf to '*'
Screenshot:
Thanks.
Regards,
Kurnianto
Accepted Solutions
11-12-2021 04:41 PM
Hi @Kurnianto Trilaksono Sutjipto ,
This seems like a connectivity issue with the URL you are trying to reach. It fails during the display() command because read is a lazy transformation and is not executed right away; display(), on the other hand, is an action, and it triggers the lazy transformation.
The following doc will help you check whether you can connect to this host: docs
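The lazy-vs-eager behaviour described above can be sketched in plain Python (an analogy, not Spark's actual machinery): a generator body, like a Spark read, does not run when it is defined, only when something consumes it, which is why the connection error only appears at display() time.

```python
# Analogy for Spark's lazy evaluation (not actual Spark code): the
# generator body below does not run when the generator is created,
# only when it is consumed -- just as spark.read defines a plan and
# display() triggers it, surfacing any connection error at that point.
def read_rows():
    # Pretend the database is unreachable: the error is raised only
    # when the generator is actually iterated.
    raise ConnectionRefusedError("connection refused")
    yield  # makes this function a generator

rows = read_rows()   # "transformation": no error yet, nothing executed
try:
    list(rows)       # "action": now the body runs and the error surfaces
except ConnectionRefusedError as err:
    print(err)       # prints: connection refused
```

So the read itself succeeding proves nothing about connectivity; the first action is where a bad URL or unreachable host actually fails.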
11-18-2021 09:53 AM
If you would like to check connectivity from Databricks to your host, execute the following command in your notebook:
%sh nc -vz <jdbcHostname> <jdbcPort>
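If `nc` is unavailable on the cluster, the same TCP reachability check can be done from a Python cell. This is a minimal sketch using only the standard library; substitute your own jdbcHostname and jdbcPort, which are placeholders here:

```python
import socket

def can_connect(host, port, timeout=5.0):
    """Rough Python equivalent of `nc -vz host port`: returns True if a
    TCP connection to host:port can be opened within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage (placeholder endpoint -- substitute your own values):
# can_connect("<jdbcHostname>", 5432)
```

Note this only verifies that the port accepts TCP connections from the driver node; executors on other nodes may still fail if the tunnel is bound only to the driver's localhost.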

