I am aware that I can load anything into a DataFrame using JDBC; that works well from Oracle sources. Is there an equivalent in Spark SQL, so that I can combine datasets as well?
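For reference, this is roughly how I load a remote Oracle table over JDBC today - a minimal sketch where the URL, credentials, driver, and table names are placeholders, not my real setup:

import org.apache.spark.sql.SparkSession

// Minimal sketch of the JDBC load mentioned above; all connection
// details below are placeholders.
val spark = SparkSession.builder()
  .appName("oracle-jdbc-load")
  .getOrCreate()

val remoteTable = spark.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL") // placeholder host/service
  .option("dbtable", "REMOTE_SCHEMA.REMOTE_TABLE")       // placeholder table
  .option("user", "scott")                               // placeholder credentials
  .option("password", "tiger")
  .option("driver", "oracle.jdbc.OracleDriver")
  .load()

remoteTable.printSchema()

That gives me one side of the join as a DataFrame, but what I'd really like is to reference the remote table directly from SQL.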
Basically I'm looking for something like this - you get the idea:
select
    lt.field1,
    rt.field2
from localTable lt
join remoteTable@serverLink rt
    on rt.id = lt.id
Thanks