Laptops to Schools (L2S)
AI-Assisted Study Note
This page brings together public scenario links and AI-assisted research notes for study use. Start with the scenario brief, make your own attempt, and open the spoiler section only when you are ready to compare.
Scenario Snapshot
| Field | Detail |
|---|---|
| Start here | Scenario brief PDF |
| Scenario source | Official or official-adjacent scenario |
| Current status | Official Practice (Live) |
| First public date | 2017-04 |
| Primary source | Open primary source |
| Coverage available | Scenario brief + Discussion or analysis |
Why This Scenario Matters
- This entry appears in the public CTA scenario corpus and has enough publicly available follow-up material to be worth tracking for study use.
Only Open If You Have Attempted the Scenario
The section below contains public follow-up links, board-call material, and AI-assisted notes compiled from those public sources.
Follow-Up Links
Board Insights & Common Pitfalls
Generalized Judge Questions
- Massive Inbound Volume: “You have 1.5 million laptops arriving monthly. How will you manage storage limits over 3 years without hitting the 10GB/20GB cap in weeks?”
- Reporting on Archives: “If you move laptop data to Big Objects or Heroku, how will the regional Donation Coordinators run their performance reports?”
- Sync for Bulk Data: “Requirement 5.3 asks for ‘real-time’ monetary updates. If 10,000 laptops are processed, how do you prevent a total integration timeout?”
- PII Hard-Silos: “How do you ensure laptop records containing PII are restricted to only the Security Team, and how is the record ‘handed back’ once PII is removed?”
- Volume-Based Assignment: “Standard assignment rules are criteria-based. How do you implement load balancing based on the current volume of records assigned to an agent?”
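Two of the questions above reward a little concrete working. On storage: 1.5M records a month for 3 years is 54M records, and at Salesforce's nominal ~2KB of data storage per record that is roughly 108GB, so an on-platform store exhausts a 10GB/20GB allocation almost immediately. On volume-based assignment: standard assignment rules evaluate record criteria, not agent workload, so the usual answers are Omni-Channel most-available routing or a custom assignment service. The least-loaded logic such a service would implement can be sketched in Python (agent names, record IDs, and the workload source are hypothetical, for illustration only):

```python
from collections import defaultdict

def assign_least_loaded(record_ids, agents, open_counts=None):
    """Assign each record to the agent with the fewest open records.

    `open_counts` is the current workload per agent (in a real org this
    would come from an aggregate query over open work items); it defaults
    to zero for every agent.
    """
    counts = defaultdict(int, open_counts or {})
    assignments = {}
    for rec in record_ids:
        # Pick the currently least-loaded agent, then count the new item.
        agent = min(agents, key=lambda a: counts[a])
        assignments[rec] = agent
        counts[agent] += 1
    return assignments

# Example: "bob" starts empty, so he absorbs work until the loads even out.
result = assign_least_loaded(
    ["r1", "r2", "r3"], ["alice", "bob"], {"alice": 2, "bob": 0}
)
```

This is deliberately simpler than Omni-Channel's capacity model, which also weights items by size and channel, but it captures the "balance by current volume" behavior the judges are probing for.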
Common Mistakes
- Standard Object Trap: Attempting to store every donated laptop as a standard Asset record without a robust off-platform archiving strategy.
- Ignoring Government Web Service: Failing to explain how to handle the external monetary value lookup (Req 5.2) within the bulk integration flow.
- Large File Misuse: Suggesting Salesforce host the 20 software applications for download. Salesforce is not a file server; external storage (S3) via Files Connect is expected.
- Round-Robin Confusion: Assuming standard assignment rules can handle “skill-based” or “load-balanced” routing for expert student forums.
Strong Patterns
- Heroku Postgres Virtualization: Using Heroku to store the 1.5M monthly records and surfacing them in Salesforce via External Objects to keep the core org lean.
- Omni-Channel Skill-Routing: Leveraging Omni-Channel for the student forum to ensure chats reach experts with the correct technical skillset.
- ESB CSV Transformation: Using MuleSoft to handle the inconsistent CSV formats from 5,000 different corporations before uploading to Salesforce.
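In practice the ESB transformation step would be written in MuleSoft DataWeave, but the core idea, mapping each corporation's header variants onto one canonical schema before the bulk upload, can be sketched in plain Python. The header aliases and canonical field names below are invented for illustration, not taken from the scenario:

```python
import csv
import io

# Hypothetical mapping from corporate header variants to a canonical schema.
HEADER_ALIASES = {
    "serial": "SerialNumber", "serial_no": "SerialNumber", "sn": "SerialNumber",
    "model": "Model", "laptop_model": "Model",
    "donor": "DonorName", "company": "DonorName",
}
CANONICAL_FIELDS = ["SerialNumber", "Model", "DonorName"]

def normalize_csv(raw_text):
    """Re-emit a CSV with canonical headers, dropping unrecognized columns."""
    reader = csv.DictReader(io.StringIO(raw_text))
    rows = []
    for row in reader:
        rows.append({
            HEADER_ALIASES[key.strip().lower()]: value
            for key, value in row.items()
            if key.strip().lower() in HEADER_ALIASES
        })
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=CANONICAL_FIELDS)
    writer.writeheader()
    writer.writerows(rows)  # missing fields are emitted as empty strings
    return out.getvalue()
```

A real middleware layer would add per-corporation mapping configs, validation, and error queues, but the shape of the transform is the same.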
Strategic Insights
- The “Large Data Volume” Benchmark: L2S is the definitive scenario for testing an architect’s ability to handle high-frequency, massive-scale record creation.
- PII Compliance: Success hinges on a clear “Data Lifecycle” for sensitive PII, including hard-gates for record access during the scrubbing process.
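The PII "Data Lifecycle" above reduces to a small state machine: records land hard-gated to the Security Team, move to a scrubbed state once PII is removed, and only then are handed back to operational teams. A minimal Python sketch of that gate logic (state names and the `LaptopRecord` class are illustrative; in Salesforce this would be enforced with sharing rules or restriction rules keyed off a status field):

```python
from enum import Enum, auto

class PiiState(Enum):
    RESTRICTED = auto()  # PII present: Security Team access only
    SCRUBBED = auto()    # PII removed: eligible for hand-back
    RELEASED = auto()    # visible to operational teams

class LaptopRecord:
    def __init__(self, record_id):
        self.record_id = record_id
        self.state = PiiState.RESTRICTED  # hard-gated by default

    def scrub(self):
        """Mark PII as removed; only meaningful while restricted."""
        if self.state is PiiState.RESTRICTED:
            self.state = PiiState.SCRUBBED

    def release(self):
        """Hand the record back; never allowed straight from RESTRICTED."""
        if self.state is not PiiState.SCRUBBED:
            raise ValueError("cannot release a record that still holds PII")
        self.state = PiiState.RELEASED
```

The point the board is testing is exactly the `release` guard: there must be no path from "contains PII" to "generally visible" that skips the scrub step.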
Additional Notes
- Focuses on NGO logistics, high-volume donation processing, and strict PII security requirements.
This is a personal study site for Salesforce CTA exam preparation. Built with AI assistance. Not affiliated with Salesforce.