Unity Catalog
12-04-2024 01:01 AM
When I try to connect my local PostgreSQL database to Databricks Unity Catalog, I run into issues. Could you please explain the steps for doing that?
12-04-2024 01:46 AM
Hi,
You can refer to the documentation linked below:
https://learn.microsoft.com/en-us/azure/databricks/query-federation/postgresql
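In case it helps, here is a minimal sketch of the Lakehouse Federation flow described on that page, run from a notebook via spark.sql(). All of the names and credentials below (connection, catalog, host, user, password, database, table) are hypothetical placeholders:

# Minimal sketch of the federation flow from the linked documentation.
# Assumes a Unity Catalog-enabled workspace and the privileges to create
# connections and catalogs; all names and credentials are placeholders.

# 1. Store the PostgreSQL host and credentials as a Unity Catalog connection.
spark.sql("""
    CREATE CONNECTION IF NOT EXISTS postgres_onprem TYPE postgresql
    OPTIONS (
        host 'db.example.com',
        port '5432',
        user 'pg_user',
        password 'pg_password'
    )
""")

# 2. Mirror one PostgreSQL database as a foreign catalog.
spark.sql("""
    CREATE FOREIGN CATALOG IF NOT EXISTS postgres_fed USING CONNECTION postgres_onprem
    OPTIONS (database 'my_database')
""")

# 3. Query the federated tables with three-level names: catalog.schema.table.
spark.sql("SELECT * FROM postgres_fed.public.my_table LIMIT 10").show()

The documentation also shows using the secret() function instead of a plain-text password.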
12-04-2024 02:17 AM
CREATE CONNECTION <connection-name> TYPE postgresql
OPTIONS (
  host '<hostname>',
  port '<port>',
  user '<user>',
  password '<password>'
);
When I run the above code from the documentation, I get errors.
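A sketch of how the statement can be run from a notebook so that the full error text is captured (the values below are hypothetical; the angle-bracket placeholders from the docs must be replaced with real ones, and the statement needs Unity Catalog-enabled compute):

# Hypothetical values; replace with the real host, port and credentials.
# CREATE CONNECTION only works on Unity Catalog-enabled compute.
try:
    spark.sql("""
        CREATE CONNECTION postgres_onprem TYPE postgresql
        OPTIONS (
            host 'db.example.com',
            port '5432',
            user 'pg_user',
            password 'pg_password'
        )
    """)
    # Confirm the connection object was created.
    spark.sql("SHOW CONNECTIONS").show()
except Exception as e:
    # The message states whether it is a syntax, permission or Unity Catalog issue.
    print(e)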
12-04-2024 02:28 AM
Again: are your local network and the database open for external connections?
Do you have a static IP (or DynDNS)?
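One quick way to verify that from the Databricks side is a plain TCP test in a notebook (the host and port below are placeholders for your public IP/DNS name and PostgreSQL port):

import socket

# Placeholders: the public IP or DNS name of the local machine and the PostgreSQL port.
host = "my-public-ip-or-dns.example.com"
port = 5432

# Open a raw TCP connection from the Databricks driver to the database port.
try:
    with socket.create_connection((host, port), timeout=5):
        print(f"{host}:{port} is reachable from the cluster")
except OSError as e:
    print(f"{host}:{port} is NOT reachable from the cluster: {e}")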
12-04-2024 02:45 AM - edited 12-04-2024 02:49 AM
Yes, I have opened the database for external connections, and the IP is static.
driver = "org.postgresql.Driver"

# Connection details for the local PostgreSQL instance
database_host = "<database-host-url>"
database_port = "5432"  # update if you use a non-default port
database_name = "<database-name>"
table = "<table-name>"
user = "<username>"
password = "<password>"

# JDBC URL for the PostgreSQL source
url = f"jdbc:postgresql://{database_host}:{database_port}/{database_name}"

# Read the remote table over JDBC into a Spark DataFrame
remote_table = (
    spark.read.format("jdbc")
    .option("driver", driver)
    .option("url", url)
    .option("dbtable", table)
    .option("user", user)
    .option("password", password)
    .load()
)
I have also tried this code, and it has the same issues.
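For reference, wrapping the same read in a try/except makes the underlying JDBC error visible, which is usually the quickest way to see whether it is a network, authentication or driver problem (this reuses the variables defined above):

# Same JDBC read as above, but printing the full driver error for diagnosis.
try:
    remote_table = (
        spark.read.format("jdbc")
        .option("driver", driver)
        .option("url", url)
        .option("dbtable", table)
        .option("user", user)
        .option("password", password)
        .load()
    )
    remote_table.show(5)  # forces the connection and fetches a few rows
except Exception as e:
    # Typical root causes: connection timeout (network/firewall), authentication
    # failure, or an unknown host in the JDBC URL.
    print(e)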
12-04-2024 09:29 PM
Could you please suggest a solution to the above query?
12-04-2024 01:58 AM
Databricks does not have connectivity to your local network out of the box.
You should set up a VNet and VNet peering (and also configure firewall rules).
Connect your Azure Databricks workspace to your on-premises network - Azure Databricks | Microsoft L...

