System.LimitException: Apex heap size too large: 6291457
What does this error mean?
The Apex heap is the portion of memory allocated to your transaction for storing variables, collections, objects, and strings during execution. The limit is 6 MB for synchronous transactions and 12 MB for asynchronous ones; when the total memory consumed by all live objects in your transaction exceeds it, Salesforce throws this uncatchable exception and rolls back all work.
Every object you instantiate and every string you build occupies heap until it is no longer referenced or the transaction ends. Unlike the SOQL query count, which only ever increases, heap usage can also go down: releasing references to large objects lets that memory be reclaimed mid-transaction.
Common Causes
1. Querying Too Many Fields on Too Many Records
The single biggest contributor to heap bloat. Every field you select in a SOQL query is loaded into memory for every record returned. A query selecting 50 fields on 5,000 records can easily exhaust 6 MB.
// ❌ BAD: SELECT * equivalent — loads every field into heap
List<Account> accounts = [
    SELECT Id, Name, Phone, Fax, Website, Industry,
           AnnualRevenue, NumberOfEmployees, Description,
           BillingStreet, BillingCity, BillingState,
           ShippingStreet, ShippingCity, ShippingState
           /* 40 more fields... */
    FROM Account
    LIMIT 5000
];
2. Storing Entire Query Results When Only IDs Are Needed
Keeping a full List<SObject> in memory just to iterate over its IDs is wasteful: every selected field of every record occupies heap even if you only ever read one or two of them.
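The difference can be sketched as follows (a minimal example; the field list and LIMIT are illustrative):

```apex
// ❌ BAD: every selected field of 5,000 records sits in heap
// just so we can collect the IDs
List<Account> accounts = [
    SELECT Id, Name, Industry, Description
    FROM Account
    LIMIT 5000
];
Set<Id> accountIds = new Set<Id>();
for (Account a : accounts) {
    accountIds.add(a.Id);
}

// ✅ GOOD: only the Id field is ever materialized
Set<Id> idsOnly = new Map<Id, Account>(
    [SELECT Id FROM Account LIMIT 5000]
).keySet();
```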
3. Large JSON Serialization
Calling JSON.serialize() on a large collection creates a new String object in heap that can be just as large as the original collection — effectively doubling the memory cost temporarily.
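One way to see the effect is to check heap usage around the call (a sketch; the query and field list are illustrative):

```apex
List<Account> accounts = [
    SELECT Id, Name, Description FROM Account LIMIT 5000
];
System.debug('Heap before serialize: ' + Limits.getHeapSize());

// The String can be roughly as large as the collection itself,
// so both occupy heap simultaneously at this point
String payload = JSON.serialize(accounts);
System.debug('Heap after serialize : ' + Limits.getHeapSize());

// If only the String is needed from here on, release the records
accounts = null;
```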
4. Building Large Strings Incrementally
Concatenating strings in a loop keeps intermediate String objects alive in the heap. A loop that builds a 4 MB string incrementally may peak at well over 6 MB due to intermediate allocations.
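Collecting the pieces and joining once avoids the intermediate allocations (a sketch assuming contacts is an already-queried List<Contact>):

```apex
// ❌ BAD: each += allocates a new String; intermediates pile up
String csv = '';
for (Contact c : contacts) {
    csv += c.Email + ',';
}

// ✅ GOOD: collect the pieces, then allocate the result once
List<String> parts = new List<String>();
for (Contact c : contacts) {
    parts.add(c.Email);
}
String joined = String.join(parts, ',');
```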
5. Retaining References to Large Collections Across Methods
Holding static references to large collections (e.g., in a static map cache) means those objects are never garbage-collected during the transaction, even after the method that created them returns.
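The pattern looks like this (AccountCache is a hypothetical example class; the clear() method is one way to give callers an explicit release hook):

```apex
public class AccountCache {
    // Static reference: these records remain reachable, and thus
    // on the heap, for the rest of the transaction
    private static Map<Id, Account> cache;

    public static Map<Id, Account> load() {
        if (cache == null) {
            cache = new Map<Id, Account>(
                [SELECT Id, Name, Industry FROM Account LIMIT 5000]
            );
        }
        return cache;
    }

    // ✅ Explicit release hook so callers can free the heap when done
    public static void clear() {
        cache = null;
    }
}
```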
| Object / Operation | Approx. Heap Cost | Risk |
|---|---|---|
| SObject with 10 fields selected | ~300–500 bytes each | Medium |
| SObject with 50 fields selected | ~2–4 KB each | High |
| JSON.serialize() of 1,000 records | ~1–5 MB | High |
| String built in a loop (10k iterations) | Variable, often 2–8 MB peak | High |
| Set<Id> with 10,000 entries | ~400 KB | Low |
| Map<Id, SObject> with 1,000 entries | ~300 KB–4 MB (depends on fields) | Medium |
How to Fix It
Solution 1: Select Only the Fields You Need
This is the highest-impact fix. Audit every SOQL query and remove fields that are not used in the logic that follows. Even saving 5 fields per record across 5,000 records can reclaim megabytes of heap.
// ✅ GOOD: Only select what the logic actually uses
List<Account> accounts = [
    SELECT Id, Name, Industry
    FROM Account
    WHERE IsActive__c = true
    LIMIT 5000
];

// If you only need IDs for a Set, don't load full SObjects
Set<Id> accountIds = new Map<Id, Account>([
    SELECT Id FROM Account WHERE IsActive__c = true
]).keySet();
Solution 2: Use transient Variables in Visualforce Controllers
In Visualforce controllers, mark large intermediate variables as transient. Transient variables are excluded from the view state, so they are not serialized and shipped back with every postback; instead, they must be rebuilt on each request.
public class AccountController {
    // ✅ transient — not stored in view state between postbacks
    public transient List<Account> largeAccountList { get; set; }

    // Persisted in view state — keep this small
    public String selectedAccountId { get; set; }

    public void loadAccounts() {
        // Re-queried on every postback — transient handles this
        largeAccountList = [
            SELECT Id, Name, Industry
            FROM Account
            LIMIT 1000
        ];
    }
}
Solution 3: Process Records in Chunks with SOQL for Loops
A SOQL for loop retrieves records in internal chunks of 200 and releases each chunk from heap after processing. This prevents you from loading the full result set into memory at once.
// ❌ Loads ALL 50,000 records into heap at once
List<Contact> all = [SELECT Id, Email FROM Contact];
for (Contact c : all) { process(c); }
// ✅ Streams 200 records at a time — heap stays flat
for (List<Contact> chunk : [SELECT Id, Email FROM Contact]) {
    processChunk(chunk); // chunk released after each iteration
}
Solution 4: Null Out Large References When Done
Apex's garbage collector runs opportunistically. Setting large collections to null after they are no longer needed makes them eligible for collection sooner, freeing heap for subsequent operations.
List<Account> accounts = [SELECT Id, Name FROM Account LIMIT 5000];
Set<Id> accountIds = buildIdSet(accounts);
// Done with the full records — release them
accounts = null;
// Continue processing using only the lightweight ID set
List<Contact> contacts = [
    SELECT Id, Email
    FROM Contact
    WHERE AccountId IN :accountIds
];
Solution 5: Monitor Heap Consumption
// Check heap at critical points during development
System.debug('Heap used : ' + Limits.getHeapSize() + ' bytes');
System.debug('Heap limit: ' + Limits.getLimitHeapSize() + ' bytes');
System.debug('Used % : '
+ ((Limits.getHeapSize() * 100)
/ Limits.getLimitHeapSize())
+ '%'
);
Pro Tip: In debug logs, search for HEAP_ALLOCATE entries to see exactly which lines are allocating the most memory. The Apex Replay Debugger in VS Code can also show live heap values at each checkpoint, making it much easier to pinpoint the allocating culprit.