java.lang.OutOfMemoryError on Data Ingestion and Storage Pipeline

hv129
New Contributor
I have around 25 GB of data in Azure storage, and I am ingesting it with Auto Loader in Databricks. These are the steps I am performing:
  1. Setting delta.enableChangeDataFeed to true.
  2. Reading the complete raw data using readStream.
  3. Writing it as a Delta table to Azure Blob Storage using writeStream (a rough sketch of steps 1-3 follows this list).
  4. Reading the change feed of this Delta table using spark.read.format("delta").option("readChangeFeed", "true")...
  5. Performing operations on the change feed table using withColumn. I also operate on the content column, which may be computationally expensive.
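For reference, here is roughly what steps 1-3 look like in PySpark; the abfss:// paths, the JSON source format, and the trigger mode are placeholders rather than my exact values:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already provided in Databricks notebooks

# Step 1: enable Change Data Feed by default for newly created Delta tables.
spark.conf.set(
    "spark.databricks.delta.properties.defaults.enableChangeDataFeed", "true"
)

# Step 2: read the raw files incrementally with Auto Loader.
raw = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")  # placeholder source format
    .option(
        "cloudFiles.schemaLocation",
        "abfss://container@account.dfs.core.windows.net/_schemas/raw",  # placeholder
    )
    .load("abfss://container@account.dfs.core.windows.net/raw")  # placeholder
)

# Step 3: write the stream out as a Delta table in Azure Blob Storage.
(
    raw.writeStream.format("delta")
    .option(
        "checkpointLocation",
        "abfss://container@account.dfs.core.windows.net/_checkpoints/bronze",  # placeholder
    )
    .trigger(availableNow=True)  # process all files currently available, then stop
    .start("abfss://container@account.dfs.core.windows.net/bronze")  # placeholder
)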

Now I am trying to save the computed PySpark DataFrame to my catalog, but I get the error java.lang.OutOfMemoryError (steps 4-5 and the failing write are sketched below). My Databricks cluster has one driver with 16 GB of memory and 4 cores, plus autoscaling up to a maximum of 10 workers, each with 16 GB of memory and 4 cores.
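Roughly, steps 4-5 and the write look like this; the startingVersion, the content_length expression, and the target table name are placeholders standing in for the real (heavier) computation:

from pyspark.sql import functions as F

# Step 4: read the change feed of the Delta table.
changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", 0)  # assumed: read the feed from version 0
    .load("abfss://container@account.dfs.core.windows.net/bronze")  # placeholder
)

# Step 5: per-row transformations with withColumn; F.length stands in
# for the real, heavier computation on the content column.
computed = changes.withColumn("content_length", F.length(F.col("content")))

# The step that fails with java.lang.OutOfMemoryError:
(
    computed.write.format("delta")
    .mode("overwrite")  # assumed write mode
    .saveAsTable("my_catalog.my_schema.computed_changes")  # placeholder table name
)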

Do I need to add more resources to the cluster, or is there a way to optimize or restructure the current pipeline?

2 REPLIES

Kaniz
Community Manager

Thank you for posting your question in our community! We are happy to assist you.

To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question?

This will also help other community members who may have similar questions in the future. Thank you for your participation, and let us know if you need any further assistance!
 
