
System.LimitException: Too many query rows: 50001

What does this error mean?

Salesforce caps the total number of records a single transaction can retrieve across all SOQL queries at 50,000 rows. This is a hard per-transaction limit — it applies to the sum of all records returned by every query in that transaction, not just a single query. Once the running total crosses 50,000, Salesforce throws this uncatchable LimitException and halts execution immediately.

Asynchronous contexts (Batch Apex, Queueable, Future, Scheduled) share this same 50,000-row limit per execution context. However, Batch Apex's Database.getQueryLocator() in the start() method can process up to 50 million records because the framework paginates internally — the limit applies only to individual execute() calls.

Common Causes

1. Broad queries without WHERE filters

A query such as [SELECT Id FROM Contact] on a large org retrieves every contact record in one shot. Without a WHERE clause or LIMIT, any object with more than 50,000 records will immediately breach the limit.

2. Multiple queries accumulating rows

Even individually selective queries can combine to exceed the limit. A trigger that fires on 200 accounts and queries 300 related contacts each adds up to 60,000 rows — over the limit even though each individual query is small.
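As a hypothetical illustration (the trigger, object, and field names are assumed, not from any real org), a per-record query inside a trigger loop multiplies row consumption:

Apex
// Hypothetical sketch — shows how rows accumulate; NOT a pattern to copy
trigger AccountRollup on Account (after update) {
    for (Account acc : Trigger.new) {
        // 200 accounts × ~300 contacts each ≈ 60,000 rows — over the 50,000 cap
        List<Contact> related = [
            SELECT Id FROM Contact WHERE AccountId = :acc.Id
        ];
        // ... process related contacts ...
    }
}

A query in a loop also burns one SOQL query per iteration toward the separate 100-query limit; a single bulk query outside the loop (`WHERE AccountId IN :Trigger.newMap.keySet()`) addresses both problems at once.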

3. Processing large datasets synchronously

Batch jobs, data-migration scripts, and reporting logic that touch tens of thousands of records are better suited to Batch Apex than to synchronous Apex, which has no mechanism for paging through large result sets.

How to Fix It

Solution 1: Add selective WHERE filters and LIMIT

Always filter your queries to the smallest possible result set. Use indexed fields in WHERE clauses and add a LIMIT as a safety ceiling.

Apex
// ❌ BAD — retrieves every Contact in the org
List<Contact> allContacts = [SELECT Id, Name FROM Contact];

// ✅ GOOD — scoped to a relevant subset (accountIds populated earlier)
List<Contact> recentContacts = [
    SELECT Id, Name
    FROM Contact
    WHERE AccountId IN :accountIds
      AND CreatedDate = LAST_N_DAYS:30
    LIMIT 10000
];

Solution 2: Use Batch Apex for large datasets

Move large-data processing into a Batch Apex class. The start() method uses Database.getQueryLocator(), which supports up to 50 million records, and each execute() chunk runs with a fresh set of governor limits.

Apex
global class ContactBatch implements Database.Batchable<SObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Up to 50M records — no row-limit risk here
        return Database.getQueryLocator(
            'SELECT Id, Name FROM Contact WHERE IsActive__c = true'
        );
    }

    global void execute(Database.BatchableContext bc, List<Contact> scope) {
        // Each chunk has fresh limits — safe to process
        for (Contact c : scope) { /* process */ }
    }

    global void finish(Database.BatchableContext bc) { }
}

// Kick off with 200 records per chunk (default)
Database.executeBatch(new ContactBatch(), 200);

Solution 3: Use SOQL for-loops to avoid heap accumulation

SOQL for-loops retrieve records in 200-record batches behind the scenes, so the full result set is never held in memory at once. Note that the retrieved rows still count toward the 50,000-row limit — this pattern prevents heap-size errors, but you still need selective filters (or Batch Apex) to stay under the row cap.

Apex
// ✅ Streams records in 200-row batches — keeps the heap small
// (rows still count toward the 50,000-row limit)
for (Contact c : [SELECT Id, Name FROM Contact WHERE IsActive__c = true]) {
    // Each iteration handles one record; no large list held in memory
}

Pro Tip: Use Limits.getQueryRows() and Limits.getLimitQueryRows() in debug logs to monitor cumulative row consumption across a transaction before you hit the ceiling.
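A minimal sketch of that check (the 10,000-row safety margin is an arbitrary example, not a Salesforce recommendation):

Apex
// Check remaining row headroom before running another large query
Integer used = Limits.getQueryRows();          // rows consumed so far in this transaction
Integer ceiling = Limits.getLimitQueryRows();  // 50,000 in synchronous Apex
System.debug('Query rows used: ' + used + ' of ' + ceiling);

if (ceiling - used < 10000) {
    // Approaching the cap — defer remaining work, e.g. enqueue a Queueable
}

Because the limit is cumulative across the transaction, this check is most useful in shared utility code that cannot know how many rows its callers have already consumed.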