Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Photon running out of memory error while executing SQL on SQL Serverless warehouse

vijamit
New Contributor

Photon ran out of memory while executing this query. Photon failed to reserve 32.0 KiB for BufferPool, in BroadcastBufferedRelation(spark_plan_id=13407).
Memory usage:
Total task memory (including non-Photon): 1829.7 MiB
BroadcastBufferedRelation(spark_plan_id=13407): allocated 1701.7 MiB, tracked 1701.7 MiB, untracked allocated 0.0 B, peak 1701.7 MiB
BufferPool: allocated 1701.7 MiB, tracked 1701.7 MiB, untracked allocated 0.0 B, peak 1701.7 MiB
AppendableMemoryPool: allocated 0.0 B, tracked 0.0 B, untracked allocated 0.0 B, peak 0.0 B
Memory consumers:
Acquired by com.databricks.photon.NativeMemoryConsumer@2a146bb: 1829.7 MiB

2 REPLIES

Khaja_Zaffer
Contributor

Hello @vijamit 

How can we help you?

Khaja_Zaffer
Contributor

Hello @vijamit 

Good day!!

The error is hard to analyze, but here is what it indicates:

The cause of the error you shared: your query ran out of memory during execution, specifically while Photon was buffering a broadcast relation (BroadcastBufferedRelation) for a join.

Running out of memory happens when memory is improperly allocated during query execution. The Photon cluster relies on accurate table statistics to optimize query execution and manage memory usage. When the statistics are incorrect, Photon may allocate insufficient memory for the query, resulting in an Out of Memory error.

Additionally, memory management issues can occur when: 

  • Queries have multiple joins, subqueries, or aggregations, which increases the complexity of memory management. This makes it more challenging for Photon to accurately estimate memory needs.
  • You're working with large datasets, which increases the likelihood of encountering an out-of-memory error. Photon may underestimate the memory required to process the data.
  • You work with dependencies such as outdated libraries or incompatible versions, which also contribute to memory management problems in Photon.

Expected solution: 

  • Ensure that all tables involved in the query have up-to-date statistics. Execute ANALYZE TABLE COMPUTE STATISTICS on each table to recompute and update the statistics.

ANALYZE TABLE <table-name> COMPUTE STATISTICS;
  • If possible, simplify complex queries by breaking them down into smaller, more manageable parts. This can help Photon better estimate memory requirements and reduce the likelihood of an out-of-memory error.

  • Upgrade to Databricks Runtime 13.3 LTS or above; versions from 13.3 LTS onward include a feature that helps mitigate this issue.
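
As a sketch of the first two points, assuming hypothetical tables `orders` and `customers` in the failing join (your table and column names will differ), you could refresh statistics and stage the query through a temporary view so the optimizer works with smaller, better-estimated inputs:

```sql
-- Recompute table- and column-level statistics so Photon's memory
-- estimates are based on current data sizes (table names are examples).
ANALYZE TABLE orders COMPUTE STATISTICS;
ANALYZE TABLE orders COMPUTE STATISTICS FOR ALL COLUMNS;
ANALYZE TABLE customers COMPUTE STATISTICS FOR ALL COLUMNS;

-- Break the complex query into stages: pre-filter and pre-aggregate
-- into a temporary view before joining, instead of one large query.
CREATE OR REPLACE TEMPORARY VIEW recent_orders AS
SELECT customer_id, SUM(amount) AS total_amount
FROM orders
WHERE order_date >= '2024-01-01'
GROUP BY customer_id;

SELECT c.name, r.total_amount
FROM recent_orders r
JOIN customers c ON c.id = r.customer_id;
```

The staged version gives the planner a much smaller build side for the join, which reduces the chance that a broadcast buffer outgrows the reserved memory.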
