Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Does Databricks offer something like Oracle's dblink?

quakenbush
Contributor

I'm aware that I can load anything into a DataFrame using JDBC, and that works well from Oracle sources. Is there an equivalent in Spark SQL, so I can combine datasets as well?

Basically something like this - you get the idea:

select
    lt.field1,
    rt.field2
from localTable lt
join remoteTable@serverLink rt
    on rt.id = lt.id

Thanks

1 ACCEPTED SOLUTION


Anonymous
Not applicable

dblink does not exist in Databricks. What you can do is create two table statements with JDBC sources and then join the two tables. It's a little more to write, but you'll get the correct result in the end.
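
For example, registering the two JDBC-backed tables in Spark SQL and joining them might look like the sketch below; the URLs, schema and table names, and credentials are all placeholders you'd replace with your own:

CREATE TABLE local_orders
USING org.apache.spark.sql.jdbc
OPTIONS (
  url 'jdbc:oracle:thin:@//host1:1521/service1',
  dbtable 'SCHEMA1.ORDERS',
  user 'user1',
  password 'password1'
);

CREATE TABLE remote_customers
USING org.apache.spark.sql.jdbc
OPTIONS (
  url 'jdbc:oracle:thin:@//host2:1521/service2',
  dbtable 'SCHEMA2.CUSTOMERS',
  user 'user2',
  password 'password2'
);

-- Once both tables are registered, the join is ordinary Spark SQL,
-- much like the dblink query in the question.
SELECT lt.field1, rt.field2
FROM local_orders lt
JOIN remote_customers rt
    ON rt.id = lt.id;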

In Python, you can perhaps do it more easily with something like:

spark.read.jdbc(url1, "schema1.table1", properties=props1) \
    .join(spark.read.jdbc(url2, "schema2.table2", properties=props2), "id", "inner")


3 REPLIES

AdrianLobacz
Contributor

quakenbush
Contributor

Thanks everyone for helping.
