4 weeks ago
Hi! I've been trying to create an external connection to Oracle but I'm getting the following error: "Detailed error message: ORA-00604: error occurred at recursive SQL level 1 ORA-01882: timezone region not found". I searched online and found some suggestions about using an older version of the JDBC driver (oracle:ojdbc6:11.1.0.7.0), but I got a "not found" error during library installation. Has anyone successfully solved this kind of issue before? Thanks!
3 weeks ago
Hey @austinoyoung,
I don't have an Oracle database to test this against, but I believe you can get around this error by following the steps laid out here:
https://stackoverflow.com/questions/9156379/ora-01882-timezone-region-not-found
In short, I understand the fix to be adding the parameter:
oracle.jdbc.timezoneAsRegion=false
to your connection. Assuming your connection is set up similarly to how it is laid out in this article, I believe the additional parameter can be added to the URL, like so:
df.write.format('jdbc').options(
    # the timezone fix is appended to the JDBC URL as a query parameter
    url='jdbc:oracle:thin:@192.168.11.100:1521:ORCL?oracle.jdbc.timezoneAsRegion=false',
    driver='oracle.jdbc.driver.OracleDriver',
    dbtable='testschema.test',
    user='testschema',
    password='password'
).mode('append').save()
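If the thin driver doesn't pick the flag up from the URL query string, the same property can usually be supplied as its own data source option instead; Spark forwards options it doesn't recognise to the JDBC driver as connection properties. A rough sketch using the same placeholder values as above:
(
    df.write.format('jdbc')
    .option('url', 'jdbc:oracle:thin:@192.168.11.100:1521:ORCL')
    .option('driver', 'oracle.jdbc.driver.OracleDriver')
    .option('dbtable', 'testschema.test')
    .option('user', 'testschema')
    .option('password', 'password')
    # non-Spark options like this one are handed to the Oracle driver as connection properties
    .option('oracle.jdbc.timezoneAsRegion', 'false')
    .mode('append')
    .save()
)
The Stack Overflow thread also mentions setting it as a JVM system property, which on a cluster would mean adding -Doracle.jdbc.timezoneAsRegion=false to spark.driver.extraJavaOptions and spark.executor.extraJavaOptions in the cluster's Spark config.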
Let me know if this doesn't help and I'll take a deeper dive for you!
TheOC
3 weeks ago
Thank you, TheOC, for the information! Yes, adding the "parameter" is the fix, but I don't seem to be able to find this option when setting up the external connection (the only available options are host, port, username, and password). Yes, we have been using a JDBC connection to programmatically connect to Oracle, but we are exploring the Lakehouse Federation way of reading data from Oracle, which is a very old Oracle database.
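For context, the external connection we're creating is roughly the SQL equivalent of that UI form, something like the sketch below (I may not have the Oracle option names exactly right, so treat it as the general shape rather than precise syntax), and there's nowhere in it to pass a driver property like oracle.jdbc.timezoneAsRegion:
spark.sql("""
    CREATE CONNECTION oracle_legacy TYPE oracle
    OPTIONS (
        host '192.168.11.100',  -- placeholder values, same as the earlier example
        port '1521',
        user 'testschema',
        password 'password'
    )
""")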
Thanks so much!
3 weeks ago
Ah - just so you're aware, I've just gone through the same steps as you, and yes, I see exactly what you mean.
Unfortunately, it appears there's no obvious way to supply additional parameters in the catalogue, so I feel we may be at a roadblock from this particular angle. I know programmatically there's nothing preventing you from passing those additional parameters into a query/script, but the UI seems to be a blocker on this one.
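For example, from a notebook the plain JDBC read path will happily carry the extra property alongside a query (a sketch reusing the placeholder values from my earlier snippet, so swap in your real host/SID/schema):
df = (
    spark.read.format('jdbc')
    .option('url', 'jdbc:oracle:thin:@192.168.11.100:1521:ORCL')
    .option('driver', 'oracle.jdbc.driver.OracleDriver')
    .option('query', 'SELECT * FROM testschema.test')
    .option('user', 'testschema')
    .option('password', 'password')
    # forwarded to the Oracle driver as a connection property
    .option('oracle.jdbc.timezoneAsRegion', 'false')
    .load()
)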
Hopefully someone who knows a little more about this side can chime in with a solution. It could also be one for the ideas section:
https://docs.databricks.com/aws/en/resources/ideas as I feel you can't be the only one with this query!
Do you know if you're able to install an older version of the JDBC driver as mentioned previously? I wonder if that would perform differently.
3 weeks ago
Thank you, TheOC, for the help! I could not find any version of the JDBC driver older than 11.2.x, and it doesn't seem to work even after I added the parameter to the Spark config of the cluster I use to create the external connection. I guess I have to go back to the coding way of getting the data out of the old database. I am not sure Databricks will spend much effort on this issue, since it is a very old version of Oracle; they would probably advise finding a third-party connector/driver. I will let you know if I find anything. Thanks again for your help!
3 weeks ago
No problem at all!
Sorry I couldn't get you the answer - I hope you have good luck in finding another route!