04.05.02 — Testing Bulk Operations and Data Volume
Lesson goal
This lesson explains how QA should test Salesforce at scale, not just functionally.
After completing this lesson, you should be able to:
- design bulk operation test cases (imports, mass updates, deletions)
- expose non-bulkified automation and performance bottlenecks
- understand data volume and ownership skew risks
- validate long-running asynchronous jobs safely
This lesson continues the Performance & Scalability section of Module 4.
The problem: one record proves nothing
A Salesforce solution that works for one record may completely fail for:
- 200 records in a single transaction
- 50,000 records processed asynchronously
- millions of records already stored in the org
Bulk Operations and high Data Volume expose:
- non-bulkified automation
- inefficient queries
- locking and sharing recalculation issues
From a QA perspective:
Scalability is not assumed — it must be proven.
Pillar 1: bulk DML operations
Bulk DML testing validates whether Flows, Apex Triggers, and other automation can handle multiple records in a single transaction.
Batch maximum load test
This is the baseline bulkification test: 200 records is the standard chunk size in which the platform processes triggers and record-triggered Flows, so it is the smallest load that genuinely exercises bulk behavior.
Action
- Use Data Loader
- Update 200 records in a single operation
- Modify a field that triggers the most complex automation on the object
(for example: Opportunity Stage change)
Expected results
- Pass: all 200 records are processed successfully
- Fail: any System.LimitException (SOQL, DML, or CPU)
Any limit exception here is a critical defect.
It guarantees production failure during normal business activity.
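Teams that automate this check alongside the manual Data Loader run can use a bulk Apex test. The sketch below assumes Opportunity Stage carries the heaviest automation, as in the example above; the class name, stage values, and assertion are illustrative and should be adapted to the real object.

```apex
// Minimal bulk-load test sketch. Object and field names mirror the example above;
// replace them with whatever carries your heaviest automation.
@isTest
private class OpportunityBulkLoadTest {

    @isTest
    static void update200OpportunitiesInOneTransaction() {
        // Arrange: 200 records, the standard trigger chunk size
        List<Opportunity> opps = new List<Opportunity>();
        for (Integer i = 0; i < 200; i++) {
            opps.add(new Opportunity(
                Name = 'Bulk Test ' + i,
                StageName = 'Prospecting',
                CloseDate = Date.today().addDays(30)
            ));
        }
        insert opps;

        // Act: one DML statement, one transaction, same shape as a Data Loader batch
        for (Opportunity o : opps) {
            o.StageName = 'Closed Won'; // the field that fires the most complex automation
        }
        Test.startTest();
        update opps;   // a LimitException here is the critical defect described above
        Test.stopTest();

        // Assert: every record was processed
        Integer updated = [SELECT COUNT() FROM Opportunity WHERE StageName = 'Closed Won'];
        System.assertEquals(200, updated, 'All 200 records must be processed in one transaction');
    }
}
```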
Concurrent bulk operations
Real systems are not used sequentially.
Action
- User A performs a bulk Account update
- User B performs a bulk Lead conversion at the same time
QA focus
- database locking
- record save failures
- unexpected partial updates
Failures here often indicate:
- row locking issues
- unhandled sharing recalculations
- poor transaction design
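Locking failures are easier to diagnose when they surface per record instead of rolling the whole batch back silently. A hedged sketch of that approach: partial-success DML with Database.update(records, false) reports UNABLE_TO_LOCK_ROW per record. The object choice and debug logging here are illustrative only.

```apex
// Sketch: run a bulk Account update with partial success enabled so that
// row-lock failures (UNABLE_TO_LOCK_ROW) are reported per record instead of
// aborting the whole batch. Intended for non-prod concurrency experiments.
List<Account> accounts = [SELECT Id, Description FROM Account LIMIT 200];
for (Account a : accounts) {
    a.Description = 'Concurrency test ' + Datetime.now();
}

Database.SaveResult[] results = Database.update(accounts, false); // allOrNone = false

Integer lockFailures = 0;
for (Database.SaveResult sr : results) {
    if (!sr.isSuccess()) {
        for (Database.Error err : sr.getErrors()) {
            // UNABLE_TO_LOCK_ROW means another transaction held the row or its parent
            if (err.getStatusCode() == StatusCode.UNABLE_TO_LOCK_ROW) {
                lockFailures++;
            }
            System.debug('Failed ' + sr.getId() + ': ' + err.getStatusCode() + ' ' + err.getMessage());
        }
    }
}
System.debug('Row-lock failures: ' + lockFailures + ' of ' + results.size());
```

Run this from one session while a second user (or a Data Loader job) performs the competing bulk operation, reproducing the User A / User B scenario above.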
Pillar 2: data volume and ownership skew
Performance issues are not caused only by batch size — total data volume and ownership distribution matter just as much.
High-volume query testing
Objects like Task, Event, and custom logging objects can reach millions of records.
Action
- Run a standard report or List View on an object with very high record count
- Include filters, grouping, or formulas if used in production
Expected result
- report completes within business-acceptable time
- no query timeout
- no unusable UI lag
If a report works only in empty sandboxes, it is not production-ready.
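One rough way to put numbers on this during exploratory testing is to time a production-shaped query from anonymous Apex; the sketch below assumes Task and a 90-day window purely as examples. The Query Plan tool in Developer Console remains the more precise check of filter selectivity.

```apex
// Sketch: time a production-shaped query against a high-volume object in a
// full or partial copy sandbox. Filter fields should match the real report;
// Task and the 90-day window here are illustrative.
Long started = System.currentTimeMillis();

List<Task> recent = [
    SELECT Id, Subject, OwnerId
    FROM Task
    WHERE CreatedDate = LAST_90_DAYS      // prefer selective, indexed filters
      AND Status != 'Completed'           // negative filters reduce selectivity
    LIMIT 2000
];

Long elapsed = System.currentTimeMillis() - started;
System.debug('Rows: ' + recent.size() + ', elapsed ms: ' + elapsed);
// A query that only performs acceptably with a LIMIT, or only in a near-empty
// sandbox, is the "not production-ready" case described above.
```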
Ownership skew testing
Ownership Skew occurs when a single user owns a disproportionate number of records.
Example
- one integration user owns millions of Tasks
- one system user owns all imported data
Action
- Update or delete records owned by the skewed user
- Trigger sharing recalculation or automation
Expected result
- operation completes without excessive delay
- no locking failures
- no cascading performance degradation
Ownership skew is one of the most common hidden production killers.
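A quick way to find candidate skewed owners before testing is an aggregate query per owner. In the hedged sketch below, Task and the 10,000-record threshold follow common Salesforce guidance; on truly massive objects the aggregate itself may exceed Apex query-row limits, which is a finding in its own right.

```apex
// Sketch: find candidate skewed owners on a high-volume object before running
// the update/delete scenario above. In Apex, the rows being aggregated still
// count toward query limits, so on multi-million-row objects run the
// equivalent query through the API or a report export instead.
AggregateResult[] ownerCounts = [
    SELECT OwnerId, COUNT(Id) total
    FROM Task
    GROUP BY OwnerId
    HAVING COUNT(Id) > 10000   // common guidance: ~10k records per owner is where skew starts to hurt
];
for (AggregateResult ar : ownerCounts) {
    System.debug(ar.get('OwnerId') + ' owns ' + ar.get('total') + ' Tasks');
}
```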
Pillar 3: long-running asynchronous jobs
Some processes are designed to exceed synchronous limits and must run asynchronously.
Batch Apex integrity test
Action
- Trigger a Batch Apex job processing a known large volume
(for example: 50,000 records)
QA verification
- job creates the expected number of batches
- each batch completes successfully
- no batch fails due to CPU or query limits
Expected result
- total number of processed records matches input exactly
- no silent failures or skipped batches
Partial success in Batch Apex is often worse than total failure — it creates silent data corruption.
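A sketch of the verification step, assuming the batch class is named MyLargeVolumeBatch and marks processed records with a hypothetical Batch_Processed__c flag; the AsyncApexJob fields queried here are standard.

```apex
// Sketch: verify a finished Batch Apex run using its AsyncApexJob record.
// 'MyLargeVolumeBatch' is a placeholder class name.
AsyncApexJob job = [
    SELECT Id, Status, JobItemsProcessed, TotalJobItems, NumberOfErrors, ExtendedStatus
    FROM AsyncApexJob
    WHERE JobType = 'BatchApex'
      AND ApexClass.Name = 'MyLargeVolumeBatch'
    ORDER BY CreatedDate DESC
    LIMIT 1
];

System.debug('Status:  ' + job.Status);                     // expect 'Completed'
System.debug('Batches: ' + job.JobItemsProcessed + ' / ' + job.TotalJobItems);
System.debug('Errors:  ' + job.NumberOfErrors);              // expect 0
System.debug('Detail:  ' + job.ExtendedStatus);              // first error message, if any

// Reconcile output against input: the counts must match exactly.
// Batch_Processed__c is a hypothetical marker; use whatever the batch actually writes.
// For counts above Apex query-row limits, reconcile via the API or a report export instead.
Integer processed = [SELECT COUNT() FROM Contact WHERE Batch_Processed__c = true];
System.debug('Records processed: ' + processed + ' (expected: 50000)');
```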
Practical QA strategies
Synthetic data generation
Empty sandboxes hide performance problems.
QA must be able to generate:
- large data volumes
- realistic parent–child relationships
- ownership skew scenarios
Use:
- Data Loader
- Data Mask
- temporary Apex or Flow utilities (in non-prod orgs)
Performance tests without realistic data are meaningless.
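As a minimal sketch of such a temporary utility, the anonymous Apex below generates parent-child volume with deliberate ownership skew; it is intended for sandboxes only, and the integration-user alias, object choices, and volumes are placeholders to adapt.

```apex
// Sketch for a sandbox only: generate parent-child volume with deliberate
// ownership skew. Volumes and the integration-user lookup are illustrative;
// re-run in chunks (or wrap in Batch Apex) to stay inside governor limits.
Id skewedOwnerId = [SELECT Id FROM User WHERE Alias = 'integ' LIMIT 1].Id; // placeholder alias

List<Account> parents = new List<Account>();
for (Integer i = 0; i < 200; i++) {
    parents.add(new Account(Name = 'Perf Test Account ' + i, OwnerId = skewedOwnerId));
}
insert parents;

List<Contact> children = new List<Contact>();
for (Account parent : parents) {
    for (Integer j = 0; j < 25; j++) {   // 200 x 25 = 5,000 children per run
        children.add(new Contact(
            LastName  = 'PerfChild ' + j,
            AccountId = parent.Id,
            OwnerId   = skewedOwnerId    // everything under one owner = skew scenario
        ));
    }
}
insert children;
// Repeat until the target volume is reached, then run the skew and volume tests above.
```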
Prioritize high-risk bulk scenarios
When time is limited, always prioritize:
- Lead Conversion: multi-object DML, high automation density
- External Integrations: payload size and batch limits
- Complex Reports: cross-object filters, formulas, and summaries
These are the first features to fail under load.
Summary: scalability is proven by volume
Bulk Operations and Data Volume testing move QA from:
- “does it work?”
to:
- “will it survive production?”
By aggressively testing:
- 200-record bulk transactions
- high data volume queries
- ownership skew
- long-running asynchronous jobs
QA ensures the Salesforce implementation is not only correct — but scalable, stable, and future-proof.
If you never test bulk operations, production users will — at the worst possible moment.