- 6266 Views
- 3 replies
- 2 kudos
Hi, we have two workspaces on Databricks, prod and dev. On prod, if we create a new all-purpose cluster through the web interface and go to Environment in the Spark UI, the spark.master setting is correctly set to be the host IP. This results in a...
Latest Reply
I hit the same issue after choosing the default cluster setup on first setup: when I went to edit the cluster to add an instance profile, I was not able to save without fixing this. Thanks for the tip.
2 More Replies
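A quick way to compare the two workspaces is to print the effective master from a notebook attached to each cluster. A minimal sketch, assuming the usual Databricks-provided `spark` session (nothing here changes cluster configuration):

```python
# Print the effective master and parallelism so prod and dev clusters can be compared.
# On the prod cluster described above this should show the host IP; on the dev
# cluster it reportedly shows "local[*]".
print(spark.sparkContext.master)               # effective spark.master value
print(spark.sparkContext.defaultParallelism)   # rough hint of how many cores the scheduler sees
```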
by Cano • New Contributor III
- 14355 Views
- 15 replies
- 0 kudos
I am trying to connect my Spark cluster to a PostgreSQL RDS instance. The Python notebook code that was used is shown below:
df = ( spark.read \
.format("jdbc") \
.option("url", "jdbc:postgresql://<connection-string>:5432/database") \
.option("dbt...
Latest Reply
"Caused by: java.net.SocketTimeoutException: connect timed out" indicate the network connection between Databricks cluster and the postgress database on 5432 port was not established and eventually timed out.As a first step, please ensure the connect...
14 More Replies
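One way to run the connectivity check that reply suggests is a quick TCP probe from the driver node before retrying the JDBC read. A sketch, with the endpoint as a placeholder:

```python
# Probe the Postgres host on port 5432 from the driver. A timeout here mirrors
# the SocketTimeoutException seen in the Spark job and points to a network/
# security-group issue rather than a Spark configuration problem.
import socket

try:
    with socket.create_connection(("<rds-endpoint>", 5432), timeout=5):
        print("TCP connection to port 5432 succeeded")
except OSError as e:
    print(f"Connection failed: {e}")
```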