Reading and saving Blob data from Oracle to S3 on Databricks is slow
I am trying to import a table from Oracle that has around 1.3 million rows, one of the columns is a Blob, and the total size of the data in Oracle is around 250+ GB. Reading it and saving it to S3 as a Delta table takes around 60 min. I tried with parallel (200 thread...
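For context, a minimal sketch of the kind of job described above: a partitioned JDBC read from Oracle followed by a Delta write to S3. The host, credentials, table name, partition column, and S3 path are placeholders, not the poster's actual values.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB")  # placeholder
    .option("dbtable", "MY_SCHEMA.BLOB_TABLE")                      # placeholder
    .option("user", "my_user")                                      # placeholder
    .option("password", "my_password")                              # placeholder
    .option("driver", "oracle.jdbc.driver.OracleDriver")
    # Split the read into parallel tasks over a numeric key column.
    .option("partitionColumn", "ID")                                # placeholder
    .option("lowerBound", "1")
    .option("upperBound", "1300000")
    .option("numPartitions", "200")
    # A larger fetch size reduces round trips when rows carry large BLOBs.
    .option("fetchsize", "1000")
    .load()
)

(
    df.write.format("delta")
    .mode("overwrite")
    .save("s3://my-bucket/delta/blob_table")                        # placeholder
)
```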
- 3407 Views
- 4 replies
- 4 kudos
Latest Reply
Hello @Rama Krishna N - We will need to check the task on the Spark UI to validate whether the operation is a read from the Oracle database or a write into S3. The task should show the specific operation on the UI. Also, the active threads on the Spark UI will ...
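A small sketch of checks that can complement the Spark UI inspection suggested in the reply: confirm how many partitions the JDBC read actually produced and how many task slots the cluster can run in parallel. `df` and `spark` are assumed to be the JDBC DataFrame and session from the sketch above.

```python
# Number of partitions the JDBC source produced; with no partitionColumn
# configured this is 1, so a single task pulls all data from Oracle.
num_read_partitions = df.rdd.getNumPartitions()

# Total task slots available across the cluster.
total_task_slots = spark.sparkContext.defaultParallelism

print(f"JDBC read partitions: {num_read_partitions}")
print(f"Default parallelism (task slots): {total_task_slots}")
```

If the partition count is far below the requested 200, the read side is likely the bottleneck regardless of cluster size; the Spark UI task timings remain the authoritative signal.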