We can use the Oracle JDBC driver to write a Spark DataFrame to Oracle tables.
Download the Oracle JDBC Driver (ojdbc jar)
You need an Oracle JDBC driver to connect to the Oracle server. Oracle ships the driver as an ojdbc jar (for example, ojdbc6.jar for JDK 6, or ojdbc8.jar for JDK 8 and later); download the version that matches your JDK.
You can download the driver from Oracle's official website (create a free Oracle account first if you do not have one), or pull it from Maven as a dependent library attached to the cluster or job directly.
In the Databricks Clusters UI, install the third-party library .jar or Maven artifact, choosing a Library Source of Upload, DBFS, DBFS/S3, or Maven. Alternatively, use the Databricks Libraries API.
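As a sketch of the API route, the cluster library can be installed with a call to the Databricks Libraries API, referencing the driver's Maven coordinates instead of a downloaded jar. The workspace URL, token, cluster ID, and driver version below are placeholders you would substitute with your own values.

```shell
# Install the Oracle JDBC driver on a running cluster via the Libraries API.
# <workspace-url>, $DATABRICKS_TOKEN, <cluster-id>, and the driver version
# are assumptions; replace them with values from your workspace.
curl -X POST https://<workspace-url>/api/2.0/libraries/install \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{
        "cluster_id": "<cluster-id>",
        "libraries": [
          { "maven": { "coordinates": "com.oracle.database.jdbc:ojdbc8:21.9.0.0" } }
        ]
      }'
```

After the call, the cluster's Libraries tab should show the artifact as installing, then installed once the cluster has fetched it from Maven Central.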
Load Spark DataFrame to Oracle Table Example
Now that the environment is set up, we can use the dataframe.write method to load a DataFrame into Oracle tables.
For example, the following piece of code establishes a JDBC connection with the Oracle database and copies the DataFrame content into the specified table.
df.write.format('jdbc').options(
    url='jdbc:oracle:thin:@192.168.11.100:1521:ORCL',
    driver='oracle.jdbc.driver.OracleDriver',
    dbtable='testschema.test',
    user='testschema',
    password='password'
).mode('append').save()
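The same write can also be expressed with the DataFrameWriter.jdbc convenience method, which takes the connection properties as a dictionary. The helper functions below are a minimal sketch (the function names and the host/port/SID values are illustrative, not part of any library), wrapping the same URL and options shown above:

```python
def oracle_jdbc_url(host, port, sid):
    """Build a thin-driver JDBC URL for an Oracle SID."""
    return f"jdbc:oracle:thin:@{host}:{port}:{sid}"

def write_df_to_oracle(df, table, host, port, sid, user, password, mode="append"):
    """Write a Spark DataFrame to an Oracle table via df.write.jdbc.

    `df` is an existing Spark DataFrame; this sketch assumes an active
    SparkSession and an Oracle JDBC driver on the cluster classpath.
    """
    df.write.jdbc(
        url=oracle_jdbc_url(host, port, sid),
        table=table,
        mode=mode,  # "append" adds rows; "overwrite" replaces the table
        properties={
            "user": user,
            "password": password,
            "driver": "oracle.jdbc.driver.OracleDriver",
        },
    )
```

With these helpers, the earlier example becomes a single call such as write_df_to_oracle(df, "testschema.test", "192.168.11.100", 1521, "ORCL", "testschema", "password").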