Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

create an external connection to oracle

austinoyoung
New Contributor III

Hi! I've been trying to create an external connection to Oracle but I'm getting the following error: "Detailed error message: ORA-00604: error occurred at recursive SQL level 1 ORA-01882: timezone region not found". I searched online and found some suggestions to use an older version of the JDBC driver (oracle:ojdbc6:11.1.0.7.0), but I got a "not found" error during library installation. Has anyone successfully solved this kind of issue before? Thanks!

5 REPLIES

TheOC
New Contributor II

hey @austinoyoung ,
I don't have an Oracle database to test this for you, but I believe you can get around this error by following the steps laid out here:
https://stackoverflow.com/questions/9156379/ora-01882-timezone-region-not-found

In short, I understand the fix to be adding the parameter:

oracle.jdbc.timezoneAsRegion=false


to your connection. Assuming your connection is set up similarly to how it is laid out in this article, I believe the additional parameter can be added to the URL, like so:

df.write.format('jdbc').options(
    url='jdbc:oracle:thin:@192.168.11.100:1521:ORCL?oracle.jdbc.timezoneAsRegion=false',
    driver='oracle.jdbc.driver.OracleDriver',
    dbtable='testschema.test',
    user='testschema',
    password='password'
).mode('append').save()
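A related note (a sketch, not tested against an Oracle instance): Spark's JDBC data source forwards options it does not recognise to the driver as connection properties, so the same flag can also be passed as a separate option rather than appended to the URL, which may behave more predictably on older drivers. The host, schema, and credentials below are the placeholders from the example above.

```python
# Placeholder connection details copied from the example above.
# Spark's JDBC data source passes unrecognised options straight
# through to the JDBC driver as connection properties.
jdbc_options = {
    "url": "jdbc:oracle:thin:@192.168.11.100:1521:ORCL",
    "driver": "oracle.jdbc.driver.OracleDriver",
    "dbtable": "testschema.test",
    "user": "testschema",
    "password": "password",
    "oracle.jdbc.timezoneAsRegion": "false",  # the timezone workaround
}

# On a cluster with the Oracle driver installed:
# df = spark.read.format("jdbc").options(**jdbc_options).load()
```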

Let me know if this doesn't help and I'll take a deeper dive for you 🙂

 


Cheers,
TheOC

austinoyoung
New Contributor III

Thank you, TheOC, for the information! Yes, adding that parameter is the fix, but I can't find this option when setting up the external connection (the only available options are host, port, username, and password). We have been using a JDBC connection to connect to Oracle programmatically, but we are exploring the Lakehouse Federation way of reading data from this Oracle database, which is a very old one.
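For reference, the external-connection UI corresponds roughly to the Databricks SQL below (a sketch from memory of the Lakehouse Federation docs; the connection name, credentials, and option keys are placeholders and should be checked against current documentation). Notably, neither statement exposes a key for driver properties such as oracle.jdbc.timezoneAsRegion, which matches what the UI shows.

```python
# Rough sketch of what the external-connection UI sets up.
# Placeholder names/credentials; verify option keys against the
# current Databricks docs before relying on this.
connection_ddl = """
CREATE CONNECTION IF NOT EXISTS oracle_conn TYPE oracle
OPTIONS (
  host '192.168.11.100',
  port '1521',
  user 'testschema',
  password 'password'
)
"""

catalog_ddl = """
CREATE FOREIGN CATALOG IF NOT EXISTS oracle_cat
USING CONNECTION oracle_conn
OPTIONS (service_name 'ORCL')
"""

# On Databricks: spark.sql(connection_ddl); spark.sql(catalog_ddl)
```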

Thanks so much!

TheOC
New Contributor II

Ah, just so you're aware, I've just gone through the same steps as you and, yes, I see exactly what you mean.
Unfortunately, there appears to be no obvious way to supply additional parameters in the catalogue, so we may be at a roadblock from this particular angle. Programmatically, nothing prevents you from passing those additional properties into a query/script, but the UI seems to be the blocker on this one.

Hopefully someone who knows a little more about this side can chime in with a solution. It could also be one for the ideas section (https://docs.databricks.com/aws/en/resources/ideas), as I feel you can't be the only one with this query!

Do you know if you're able to install an older version of the JDBC driver as mentioned previously? I wonder if that would perform differently.
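On the driver-version point: Oracle publishes its current JDBC drivers on Maven Central under the com.oracle.database.jdbc group, whereas the old ojdbc6 artifacts are generally not there, which would explain the "not found" error during library installation. A Maven coordinate along these lines should resolve (the exact version below is illustrative; check Maven Central for current releases):

```text
com.oracle.database.jdbc:ojdbc8:21.9.0.0
```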

Cheers,
TheOC

austinoyoung
New Contributor III

Thank you, TheOC, for the help! I could not find any version of the JDBC driver older than 11.2.x, and that one doesn't seem to work either, even after I added the parameter to the Spark config of the cluster I use to create the external connection. I guess I have to go back to the coding way to get the data out of the old database. I'm not sure Databricks will spend much effort on this issue since it's a very old version of Oracle; they would probably advise finding a third-party connector/driver. I will let you know if I find anything. Thanks again for your help!
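For anyone hitting the same wall, the Spark-config attempt described above would look roughly like this in the cluster's Spark config (a sketch; these JVM flags apply cluster-wide and only affect JDBC reads run on that cluster, which may be why they did not help the federation connection in this thread):

```text
spark.driver.extraJavaOptions -Duser.timezone=UTC -Doracle.jdbc.timezoneAsRegion=false
spark.executor.extraJavaOptions -Duser.timezone=UTC -Doracle.jdbc.timezoneAsRegion=false
```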

TheOC
New Contributor II

No problem at all!
Sorry I couldn't get you the answer - I hope you have good luck in finding another route!

Cheers,
TheOC
