Trade-offs
Every data architecture decision involves trade-offs. A CTA does not just pick the “right” answer — they articulate what they gained and what they gave up. This page documents the critical trade-offs a CTA must be able to discuss fluently during the review board.
Normalization vs Denormalization
This is the foundational tension in any data model. Salesforce’s platform constraints shift the traditional balance.
The Trade-off
```mermaid
graph LR
    A[Normalization] <-->|Tension| B[Denormalization]
    A --> A1[Data integrity]
    A --> A2[No redundancy]
    A --> A3[Smaller records]
    A --> A4[More SOQL joins]
    A --> A5[Complex report types]
    B --> B1[Faster queries]
    B --> B2[Simpler reports]
    B --> B3[Data redundancy risk]
    B --> B4[Larger records]
    B --> B5[Sync triggers needed]
```
| Dimension | Normalized | Denormalized |
|---|---|---|
| Data integrity | High — single source of truth | Risk — must keep copies in sync |
| Query complexity | Higher — relationship queries, subqueries | Lower — single object queries |
| SOQL limits | More relationship queries consumed | Fewer queries needed |
| Report design | Complex report types across objects | Simpler single-object reports |
| Storage | Efficient — no duplicate data | Wasteful — duplicate data stored |
| Maintenance | Easier — update in one place | Harder — update in multiple places |
| Performance at scale | May degrade with many joins on LDV | Better read performance on LDV |
| Field count | Spread across objects | Risk hitting 800-field limit |
When to Normalize on Salesforce
- The entity is referenced from multiple parent objects
- Data changes frequently and must be consistent everywhere
- You are not hitting SOQL relationship query limits
- The entity has its own security and sharing requirements
When to Denormalize on Salesforce
- Read performance is critical and the data is queried together 90%+ of the time
- The denormalized data rarely changes (reference data, configuration)
- You are hitting SOQL relationship query limits in critical paths
- Reporting requirements demand single-object simplicity
- Formula fields can bridge the gap (calculated denormalization without sync risk)
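The normalization decision shows up directly in the shape of your SOQL. A minimal sketch, using entirely hypothetical custom objects and fields (`Quote_Line__c`, `Product__c`, `Copied_Unit_Price__c`) to contrast the two query shapes:

```apex
// Normalized: the price lives on a related Product__c record, so every
// read traverses a relationship (all object/field names hypothetical).
List<Quote_Line__c> normalized = [
    SELECT Id, Quantity__c, Product__r.Unit_Price__c
    FROM Quote_Line__c
    WHERE Quote__c = :quoteId
];

// Denormalized: the price has been copied onto the line item itself,
// so the read is a flat single-object query — but Copied_Unit_Price__c
// must now be kept in sync whenever the source price changes.
List<Quote_Line__c> denormalized = [
    SELECT Id, Quantity__c, Copied_Unit_Price__c
    FROM Quote_Line__c
    WHERE Quote__c = :quoteId
];
```

A formula field defined as `Product__r.Unit_Price__c` sits between these two: it reads like the denormalized field but is evaluated at runtime, so there is no sync risk (at the cost of the cross-object read still happening under the hood, and formula fields not being filterable in all contexts).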
CTA presentation framing
“I chose to denormalize the product pricing data onto the Quote Line Item because it is queried on every CPQ page load, rarely changes, and eliminating the cross-object query reduced page load time from 3.2s to 0.8s. The trade-off is that price changes require a batch job to propagate, which runs nightly.”
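A nightly propagation job like the one described above could be sketched as an Apex batch. This is an illustrative outline only — `Quote_Line__c`, `Product__r.Unit_Price__c`, and `Copied_Unit_Price__c` are hypothetical names, and a production version would need error handling and a tighter change-detection filter:

```apex
// Hypothetical nightly batch that re-copies changed prices onto the
// denormalized line-item field. Scheduled via System.schedule or a
// Scheduled Apex wrapper calling Database.executeBatch.
global class PricePropagationBatch implements Database.Batchable<SObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Only touch lines whose source product changed in the last day
        return Database.getQueryLocator(
            'SELECT Id, Product__r.Unit_Price__c, Copied_Unit_Price__c ' +
            'FROM Quote_Line__c ' +
            'WHERE Product__r.LastModifiedDate = LAST_N_DAYS:1'
        );
    }

    global void execute(Database.BatchableContext bc, List<Quote_Line__c> scope) {
        for (Quote_Line__c line : scope) {
            line.Copied_Unit_Price__c = line.Product__r.Unit_Price__c;
        }
        update scope;
    }

    global void finish(Database.BatchableContext bc) {}
}
```

The batch shape matters for the trade-off story: it bounds how stale the denormalized copy can be (up to one sync interval), which is exactly the cost the architect is accepting in exchange for the faster page load.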
On-Platform vs External Data
Where data lives determines cost, performance, capabilities, and complexity.
The Trade-off
| Dimension | On-Platform (Salesforce) | External (Connect / Data Lake) |
|---|---|---|
| Query speed | Sub-second (local data) | 1-5 seconds (network roundtrip) |
| SOQL support | Full SOQL, SOSL, reports | Limited SOQL subset |
| Automation | Full trigger/flow/workflow support | None — external objects do not support triggers or record-triggered flows |
| Storage cost | Salesforce data storage (expensive) | External storage (cheaper) |
| Data freshness | As fresh as last sync | Real-time (live query) |
| Availability | Independent of external systems | Dependent on external system uptime |
| Reporting | Full reporting and dashboards | Limited or no reporting |
| Global search | Included | Only if search is enabled and the external system supports it |
| Mobile offline | Available | Not available |
Decision Heuristic
- If users need to search, report, trigger, or workflow on the data — bring it on-platform
- If the data is large, read-only, and referenced occasionally — keep it external
- If the data changes rapidly in the external system and you need current values — virtualize with Connect
- If the data is the system of record in Salesforce — it must be on-platform
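Virtualized data surfaces in Apex as external objects with the `__x` suffix. A hedged sketch (object and field names hypothetical; the exact SOQL subset supported depends on the Salesforce Connect adapter):

```apex
// External object query: Invoice__x is virtualized via Salesforce
// Connect, so this SOQL triggers a live callout to the external
// system rather than reading local storage. Only a subset of SOQL
// is supported, and response time depends on the remote endpoint.
List<Invoice__x> invoices = [
    SELECT ExternalId, Amount__c, Status__c
    FROM Invoice__x
    WHERE Account__c = :accountId
    LIMIT 50
];
```

Note how this makes the availability trade-off from the table concrete: if the external system is down, this query fails at runtime, so any critical path built on it needs a degradation story.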
Cost is often the real driver
Salesforce data storage is significantly more expensive per GB than cloud storage (S3, Azure Blob, GCS). For large datasets with limited Salesforce interaction, the cost argument alone can justify external storage. Quantify this for the review board.
Big Bang vs Phased Migration
The cutover strategy affects risk, timeline, cost, and organizational stress.
The Trade-off
| Dimension | Big Bang | Phased |
|---|---|---|
| Risk | High — all-or-nothing | Lower — incremental |
| Downtime | One concentrated window | Multiple smaller windows |
| Complexity | Lower (single event) | Higher (coordination, data split) |
| Duration | Short wall-clock time | Long wall-clock time |
| Rollback | Clear rollback point | Complex partial rollback |
| User impact | One big change | Gradual adaptation |
| Data consistency | Consistent at cutover point | Temporarily split across systems |
| Cost | Lower (one event) | Higher (multiple events, dual maintenance) |
| Team stress | Very high during cutover | Sustained moderate stress |
When Big Bang Works
- Total data volume fits within an available downtime window
- Few source systems with clear data boundaries
- Organization can tolerate a single high-risk event
- Rollback is straightforward (restore from backup)
When Phased Is Required
- Data volume exceeds what can be loaded in any available downtime window
- Multiple business units with different readiness timelines
- High-risk, compliance-sensitive data that needs careful validation
- Organization cannot tolerate extended downtime
The Parallel Run Option
A parallel run (both systems live simultaneously) mitigates risk further but at significant cost:
- Dual data entry — Users must enter data in both systems
- Reconciliation — Results must match between systems
- Duration — Typically 2-4 weeks of parallel operation
- Best for — Financial systems, regulatory systems, or any system where data errors have legal consequences
Standard vs Custom Objects
This trade-off appears in almost every CTA scenario.
The Trade-off
| Dimension | Standard Object | Custom Object |
|---|---|---|
| Time to value | Fast — features built in | Slower — build everything |
| Platform integration | Deep — Einstein, forecasting, etc. | Shallow — must configure |
| AppExchange | Compatible by default | May need custom mapping |
| Flexibility | Constrained by existing schema | Full control |
| User familiarity | Users know “Accounts” and “Cases” | Requires training |
| Naming | May not match business language | Can match exactly |
| Sharing defaults | May have complex pre-set sharing | Clean slate |
| Future-proofing | Salesforce enhances standard objects | You maintain custom objects |
The Hidden Costs
Standard object hidden costs:
- Fighting default behaviors that don’t match your process
- Explaining to users why the object is named differently than their business concept
- Managing field-level security on fields you don’t need but can’t remove
- Dealing with standard object quirks in Apex (e.g., Person Account exceptions)
Custom object hidden costs:
- Rebuilding features that come free with standard objects (reporting, mobile layouts)
- Maintaining custom objects through Salesforce releases
- Missing out on future Salesforce innovations that target standard objects
- AppExchange integration mapping and maintenance
The 80/20 rule
If a standard object covers 80% of your requirements, use it and customize the remaining 20%. If it covers less than 50%, the cost of fighting the standard object exceeds the cost of building custom.
Lookup vs Master-Detail (Trade-off View)
Beyond the decision flowchart in Decision Guides, here is the trade-off lens.
What You Gain and Lose
| You gain with Master-Detail | You lose with Master-Detail |
|---|---|
| Native roll-up summaries | Independent child ownership |
| Sharing inheritance (simpler security) | Flexible reparenting |
| Cascade delete (cleanup) | Optional parent field |
| Tight parent-child coupling | Child record autonomy |
| You gain with Lookup | You lose with Lookup |
|---|---|
| Independent sharing model | Must build roll-ups manually |
| Optional relationship | No cascade delete |
| Easy reparenting | No sharing inheritance |
| Child has its own owner | More complex security design |
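"Must build roll-ups manually" is worth making concrete, because it is recurring maintenance cost, not a one-time build. A minimal trigger sketch, using hypothetical `Parent__c`/`Child__c` objects — a production version would also handle deletes, undeletes, and reparenting (and many teams reach for a packaged roll-up tool instead):

```apex
// With a lookup relationship there is no native roll-up summary, so
// the parent total must be recomputed in code. Hypothetical objects:
// Child__c has a lookup Parent__c and a currency field Amount__c;
// Parent__c has a Total_Amount__c field to hold the sum.
trigger RollupChildAmounts on Child__c (after insert, after update) {
    Set<Id> parentIds = new Set<Id>();
    for (Child__c c : Trigger.new) {
        if (c.Parent__c != null) parentIds.add(c.Parent__c);
    }
    if (parentIds.isEmpty()) return;

    List<Parent__c> parents = new List<Parent__c>();
    for (AggregateResult ar : [
        SELECT Parent__c p, SUM(Amount__c) total
        FROM Child__c
        WHERE Parent__c IN :parentIds
        GROUP BY Parent__c
    ]) {
        parents.add(new Parent__c(
            Id = (Id) ar.get('p'),
            Total_Amount__c = (Decimal) ar.get('total')
        ));
    }
    update parents;
}
```

With master-detail, all of this is a declarative roll-up summary field — which is precisely the "you gain" column above.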
The Conversion Risk
Converting between relationship types after data exists:
- Lookup to Master-Detail: Requires all lookup field values to be non-null. Must check every record. If any are null, you must populate them first or delete those records.
- Master-Detail to Lookup: Can be done immediately but loses all roll-up summaries, sharing inheritance, and cascade delete. All child records get their own OwnerId assigned.
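Before attempting the lookup-to-master-detail conversion, the null check can be run up front. A one-line sketch with hypothetical names:

```apex
// Count child records that would block a lookup-to-master-detail
// conversion (Child__c / Parent__c are hypothetical names). The
// conversion cannot proceed until this returns zero.
Integer orphans = [SELECT COUNT() FROM Child__c WHERE Parent__c = null];
System.debug(orphans + ' child records must be reparented or deleted first');
```

Running this in a sandbox copy of production data during design is cheap insurance against discovering orphans mid-deployment.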
Plan relationship types during design
Converting relationship types in production is operationally expensive and risky. The CTA should invest time during design to get this right. If in doubt, start with master-detail (more restrictive, harder to add later) and relax to lookup only if independence is truly needed.
How to Present Trade-offs on the CTA Exam
The review board is not looking for perfect answers — they are looking for architects who understand consequences. Use this structure:
- State the decision: “I chose X over Y”
- Explain the gain: “This gives us…”
- Acknowledge the cost: “The trade-off is…”
- Justify: “I accepted this trade-off because…”
- Mitigate: “To address the downside, I designed…”
This pattern works for every trade-off on this page and demonstrates the architectural maturity that earns CTA certification.
Sources
- Salesforce Architects: Data 360 Architecture
- Salesforce Help: Object Relationships Overview
- Salesforce Help: Best Practices When You Migrate Data
- Salesforce Help: External Objects in Salesforce Connect
- CTA Study Guide: Data Domain
- Martin Fowler: Patterns of Enterprise Application Architecture (normalization principles)