Domain Grilling: D1 System Architecture

AI-Generated Content — Use for Reference Only

This content is AI-generated and has only been validated by AI review processes. It has NOT been reviewed or validated by certified Salesforce CTAs or human subject matter experts. Do not rely on this content as authoritative or completely accurate. Use it solely as a reference point for your own study and preparation. Always verify architectural recommendations against official Salesforce documentation.

System Architecture is the broadest CTA domain, covering org strategy, licensing, mobile, reporting, document management, and platform capabilities. Judges probe for your ability to justify major structural decisions — org strategy alone can make or break a board attempt.

Type 1: Invalid — “Your Solution Won’t Work”

These questions challenge a flaw in your design. The judge believes your approach is technically incorrect or impossible.

Q1.1: Salesforce Mobile App with Offline Large File Attachments

Judge: “You said you’d use the Salesforce Mobile App for field technicians who need offline access with large file attachments. But Briefcase has a 50,000 record limit across all briefcases and limited offline file support. Will that actually work for your 200 technicians downloading 50 MB inspection reports?”

What they’re testing: Whether you understand Briefcase limitations and the boundary between standard mobile and SDK-based solutions.

Model answer: “You’re right — Briefcase is designed for record-level offline access, not heavy file synchronization. With 200 technicians each needing large inspection reports offline, Briefcase’s 50,000-record org-wide limit and limited file handling would not meet the requirement. I would revise this to use Field Service Mobile for the technician persona, which has purpose-built offline capabilities including offline knowledge articles and work order attachments. If the file requirements exceed even Field Service Mobile’s capacity, I would evaluate the Mobile SDK with SmartStore for encrypted local storage and MobileSync for bidirectional sync, accepting the higher development and maintenance cost.”


Q1.2: Single Org with Regulatory Data Residency

Judge: “You proposed a single org, but one business unit operates in the EU under strict GDPR data residency requirements and the other is US-based. How do you prevent EU personal data from being processed in US infrastructure?”

What they’re testing: Whether you understand that Hyperforce solves data-at-rest residency but not all processing scenarios.

Model answer: “That’s a critical constraint I need to address more precisely. A single Salesforce org on Hyperforce in an EU region can satisfy data-at-rest residency, but I need to audit the full data flow. Email services, Einstein AI features, and any AppExchange packages may process data outside the EU region. If the GDPR requirements are strict enough that no EU personal data can ever transit US infrastructure — including during processing — then multi-org becomes necessary. I would recommend evaluating Hyperforce EU deployment first with a full data flow audit, and only fall back to multi-org if the audit reveals processing routes that violate the residency requirement.”


Q1.3: Standard Reports for Complex Cross-System Analytics

Judge: “You recommended standard reports and dashboards for the executive analytics requirement, but the scenario says executives need cross-system visibility combining Salesforce pipeline data, ERP revenue data, and marketing campaign spend. Standard reports can only query Salesforce data. How does that work?”

What they’re testing: Whether you understand the limits of native reporting when data lives across multiple systems.

Model answer: “You’re right — standard reports are limited to Salesforce data sources and custom report types joining up to four objects. For cross-system executive analytics combining CRM, ERP, and marketing data, I would revise my recommendation. I considered CRM Analytics with external data connectors, but given the multi-source requirement and the executive persona who likely consumes analytics outside Salesforce, Tableau is the better fit. Tableau can connect directly to the data warehouse where ERP and marketing data are consolidated, alongside Salesforce data via the native connector. I would keep standard reports for operational CRM reporting used by sales managers within Salesforce.”


Q1.4: Platform License for Users Needing Case Access

Judge: “You assigned Platform Plus licenses to the operations team, but in the scenario they need to view and create Cases for internal IT support requests. Platform licenses don’t include Case access. How do you handle that?”

What they’re testing: Whether you know the exact object access restrictions on Platform licenses.

Model answer: “That’s correct — Platform licenses only provide access to Accounts, Contacts, and custom objects. Cases are a Service Cloud standard object excluded from Platform licenses. I have two options: upgrade those users to Service Cloud licenses, which adds cost, or redesign the IT support workflow to use a custom object (like ‘IT Request’) instead of Cases. I considered the custom object approach because the operations team is 40 users and IT requests are simpler than customer cases — they don’t need entitlements, milestones, or escalation rules. But if the scenario requires integration with the existing Service Cloud case management used by the IT help desk, then Service Cloud licenses are the correct choice despite the higher per-user cost.”

Type 2: Missed — “You Haven’t Addressed…”

These questions point out a requirement you didn’t cover.

Q2.1: Multi-Currency Implications

Judge: “The scenario mentions the company operates in 12 countries with 8 currencies. You didn’t address multi-currency at all in your architecture. What’s your approach?”

What they’re testing: Whether you understand multi-currency implications on reporting, data model, and integration.

Model answer: “I should have addressed that explicitly. For 8 currencies across 12 countries, I would enable Advanced Currency Management with dated exchange rates, which allows opportunity amounts to reflect the exchange rate at the close date rather than the current rate. This affects reporting — all currency roll-ups and forecasting would use dated rates. For the data model, I need to ensure that any custom currency fields on custom objects are properly tied to the corporate currency for consistent reporting. The integration layer also needs to handle currency conversion — records coming from the ERP must include the ISO currency code, and the middleware should use the same exchange rate source as Salesforce to avoid discrepancies. The trade-off with Advanced Currency Management is that it increases storage and processing overhead for currency-related calculations.”
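The middleware rate-alignment point above can be sketched as a dated-rate lookup. This is an illustrative Python model, not Salesforce code: the EUR rates, the `rate_on`/`to_corporate` helpers, and the USD corporate currency are all invented for the example. Rates follow Salesforce's storage convention of units of foreign currency per one unit of the corporate currency, and the lookup picks the rate whose effective-start date is the latest one on or before the opportunity close date.

```python
from datetime import date
from bisect import bisect_right

# Hypothetical dated rates for EUR: (effective_from, EUR per 1 USD).
# Salesforce keeps these in dated conversion rate records; the middleware
# would need to mirror the same source to avoid discrepancies.
EUR_DATED_RATES = [
    (date(2024, 1, 1), 0.92),
    (date(2024, 4, 1), 0.94),
    (date(2024, 7, 1), 0.90),
]

def rate_on(close_date, dated_rates):
    """Return the dated rate in effect on close_date (latest start <= close_date)."""
    starts = [start for start, _ in dated_rates]
    i = bisect_right(starts, close_date) - 1
    if i < 0:
        raise ValueError("no rate effective on that date")
    return dated_rates[i][1]

def to_corporate(amount, close_date, dated_rates):
    """Convert a foreign-currency amount to corporate currency using the dated rate."""
    return amount / rate_on(close_date, dated_rates)

# An opportunity closed 2024-05-15 for EUR 94,000 rolls up at the Q2 rate:
print(round(to_corporate(94_000, date(2024, 5, 15), EUR_DATED_RATES), 2))  # 100000.0
```

The same close-date-driven lookup is what keeps ERP-sourced amounts consistent with Salesforce roll-ups when both sides share one rate source.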


Q2.2: Document Management Strategy

Judge: “Your solution has the sales team generating proposals and contracts, but you haven’t described where documents are stored, how they’re generated, or how version control works. What’s your document management strategy?”

What they’re testing: Whether you considered the full document lifecycle — generation, storage, versioning, and compliance.

Model answer: “I should have addressed document management explicitly. For proposal and contract generation, I would evaluate an AppExchange document generation solution like Conga or DocuSign Gen — building a custom document engine is rarely justified. Generated documents would be stored as Salesforce Files attached to the Opportunity or Contract record, which provides native versioning, sharing, and mobile access. For version control, Salesforce Files natively tracks versions when users upload new copies to the same ContentDocument. For large-scale document storage, I would monitor file storage consumption against the per-user allocation and consider Files Connect to link to an external DMS like SharePoint if storage becomes a cost concern. The e-signature workflow would use DocuSign or Adobe Sign integrated via AppExchange.”


Q2.3: Disaster Recovery and Business Continuity

Judge: “This is a mission-critical system for a financial services company. You haven’t mentioned disaster recovery or business continuity. What happens if Salesforce has a major outage?”

What they’re testing: Whether you understand Salesforce’s shared responsibility model and what additional DR measures an architect should recommend.

Model answer: “For a mission-critical financial services deployment, I would address business continuity at three levels. First, Salesforce provides infrastructure-level DR with multi-AZ redundancy on Hyperforce and a documented disaster recovery process with RPO near zero and RTO of approximately 12 hours for major events. Second, for data protection beyond what Salesforce provides, I would recommend a third-party backup solution like OwnBackup or Grax that exports data to customer-controlled storage on a daily schedule — this protects against accidental data deletion and provides independent recovery capability. Third, for operational continuity during an outage, I would design graceful degradation in the integration layer so that external systems queue transactions when Salesforce is unavailable and replay them on recovery. This is where the MuleSoft integration layer earns its cost — it can buffer and retry.”

Type 3: Suboptimal — “Have You Considered…?”

These questions suggest a potentially better approach.

Q3.1: Multi-Org vs Single Org with Business Unit Isolation

Judge: “You proposed a single org, but the two business units have completely different data models, different sales processes, and different compliance requirements. Wouldn’t multi-org give you cleaner separation?”

What they’re testing: Whether you can defend single-org when challenged, or recognize when multi-org is actually better.

Model answer: “I considered multi-org and chose single-org because the scenario mentions shared customers between the two business units — that is the strongest signal for single-org. Different data models can be handled with separate custom objects and record types. Different sales processes work well with separate sales processes on Opportunity. For the compliance differences, I would use permission sets and field-level security to enforce separation, combined with restriction rules if needed to prevent cross-BU data visibility. The trade-off I accept is operational complexity in managing shared metadata, but I mitigate that with unlocked packages for each business unit’s customizations. If the compliance requirement is truly regulatory isolation where data cannot coexist in the same database — like FedRAMP vs commercial — then I would pivot to multi-org with a MuleSoft integration layer for the shared customer data.”


Q3.2: CRM Analytics vs Reporting Snapshots

Judge: “You’re using CRM Analytics dataflows to track pipeline changes over time. Have you considered Reporting Snapshots? They’re simpler and don’t require a CRM Analytics license.”

What they’re testing: Whether you considered the simpler, cheaper option before jumping to a premium tool.

Model answer: “I considered Reporting Snapshots — they are a valid option for point-in-time data capture at lower cost. The reason I chose CRM Analytics is that the scenario requires not just historical snapshots but interactive trending visualizations, cross-object correlation with win rate by product family, and Einstein Discovery predictions on deal outcomes. Reporting Snapshots capture the data but would still need CRM Analytics or Tableau for the analytical layer. If the requirement were purely ‘show me pipeline value each month for the last 12 months,’ Reporting Snapshots to a custom object with standard reports would be the right choice. But the predictive analytics and multi-dimensional exploration push this into CRM Analytics territory.”


Q3.3: Mobile Publisher vs PWA

Judge: “You recommended Mobile Publisher for the partner portal. Have you considered a PWA on Experience Cloud instead? It avoids app store deployment and update cycles entirely.”

What they’re testing: Whether you understand the trade-offs between native-wrapped and PWA approaches for external users.

Model answer: “I considered PWA and it is a viable alternative. I chose Mobile Publisher because the scenario mentions push notifications as a requirement for deal registration alerts, and the partners expect a branded app store presence for credibility. PWAs support push notifications on both Android and iOS (since iOS 16.4), but the iOS implementation requires users to first add the PWA to their home screen before push notifications can be received — reducing the effective notification audience. The trade-off with Mobile Publisher is the app store review cycle for updates, which I mitigate by keeping the portal functionality server-side so most changes deploy without an app update. If the partner base is predominantly Android-based or the push notification requirement is not critical, PWA would be the simpler choice with zero app store friction.”


Q3.4: Edition Upgrade vs Feature Add-On

Judge: “You recommended upgrading from Enterprise to Unlimited edition for the whole org just to get Shield Platform Encryption. Isn’t that expensive for 500 users? Have you considered just buying Shield as an add-on?”

What they’re testing: Cost-consciousness and license optimization awareness.

Model answer: “That is a valid challenge. The math depends on the per-user price delta between Enterprise and Unlimited versus the Shield add-on cost — and note that Shield is licensed as an add-on on both editions, so the upgrade has to justify itself on platform benefits rather than on Shield alone. For 500 users, Unlimited provides 5,000 API calls per user per day instead of 1,000, 120 MB data storage per user instead of 20 MB, 100 developer sandboxes instead of 25, and Premier Support included. If the scenario’s integration architecture is API-heavy and the data volume requires more storage, the Unlimited upgrade might justify itself. But if API calls and storage are not concerns, purchasing Shield as an add-on on Enterprise edition is more cost-effective. I would model both options with actual numbers before making the final recommendation.”
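As a sketch of the “model both options with actual numbers” step, the comparison might look like the following. Every figure here is an invented placeholder — the per-user list prices and the Shield uplift percentage must be replaced with actually quoted numbers before any recommendation.

```python
# Back-of-the-envelope licensing comparison. ALL prices are hypothetical
# placeholders, not real Salesforce pricing -- substitute quoted figures.
USERS = 500
ENTERPRISE_PER_USER = 165   # assumed USD/user/month
UNLIMITED_PER_USER = 330    # assumed USD/user/month
SHIELD_UPLIFT = 0.30        # assumed: Shield quoted as a % of net platform spend

def annual_cost(per_user, users, shield_uplift=0.0):
    """Annual license cost, optionally uplifted by a Shield percentage."""
    base = per_user * users * 12
    return base * (1 + shield_uplift)

enterprise_plus_shield = annual_cost(ENTERPRISE_PER_USER, USERS, SHIELD_UPLIFT)
unlimited_plus_shield = annual_cost(UNLIMITED_PER_USER, USERS, SHIELD_UPLIFT)

print(f"Enterprise + Shield add-on: ${enterprise_plus_shield:,.0f}/yr")
print(f"Unlimited + Shield add-on:  ${unlimited_plus_shield:,.0f}/yr")
```

With real quotes in place, the delta between the two lines is the price the client pays for the extra API calls, storage, and sandboxes.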

Type 4: Rationale Missing — “WHY Did You Choose…?”

These questions probe the reasoning behind a correct decision.

Q4.1: Single-Org Justification

Judge: “Walk me through specifically why you chose single-org for this architecture. What were the deciding factors?”

What they’re testing: Whether you have a structured decision framework, not just a gut feeling.

Model answer: “I evaluated four factors. First, shared customers — the scenario states 60% customer overlap between the two business units, which means single-org gives a unified 360-degree customer view without cross-org integration. Second, reporting — the CFO needs consolidated pipeline reporting across both business units, which is native in a single org but requires a data warehouse in multi-org. Third, identity — single-org means one SSO integration, one role hierarchy, one sharing model. Multi-org would require Salesforce-to-Salesforce or middleware-based data sync, adding latency and failure points. Fourth, cost — one org with Enterprise edition costs less than two orgs with separate license pools. The trade-off I accept is shared governor limits and a single release cycle, which I mitigate with unlocked packages and sandboxes.”


Q4.2: Reporting Tool Selection Reasoning

Judge: “Why did you choose Tableau over CRM Analytics for the executive dashboards? What specific requirement drove that choice?”

What they’re testing: Whether you can articulate the precise threshold that tipped the decision.

Model answer: “The deciding factor was data source breadth. The executive dashboard requirements include Salesforce pipeline data, SAP ERP revenue actuals, Marketo campaign spend, and Snowflake-based customer usage metrics. CRM Analytics can connect to external data via connectors, but it is optimized for Salesforce-centric analytics — the external data pipeline would require significant dataflow engineering and has row limits on external connectors. Tableau connects natively to Snowflake, SAP via JDBC, and Salesforce via the native connector, treating all four data sources as first-class. The executives also do not work in Salesforce daily, so embedding analytics in Lightning pages adds no value for them. CRM Analytics remains the right choice for sales manager dashboards within Salesforce, where users are already in the platform and need operational, record-level analytics.”


Q4.3: Licensing Decision Reasoning

Judge: “You assigned Customer Community Plus licenses to portal users instead of Customer Community. Why the more expensive license?”

What they’re testing: Whether you have a specific technical reason or are over-licensing.

Model answer: “Customer Community Plus was necessary for two specific requirements. First, the portal requires record-level sharing beyond simple account-based access — franchise owners need to see records for their region but not other regions, which requires a role hierarchy and sharing rules. Customer Community licenses do not support role hierarchy or sharing rules — all community users share a single role. Second, the portal needs API access for a mobile app that authenticates portal users and calls Salesforce APIs. Customer Community does not include API access. The cost delta is justified by these two technical requirements. If the portal only needed account-based visibility and no API access, Customer Community would be sufficient.”


Q4.4: Async Processing Pattern Choice

Judge: “You chose Queueable Apex over Batch Apex for the nightly data processing. Why?”

What they’re testing: Whether you understand the trade-offs between async patterns.

Model answer: “The nightly processing handles approximately 5,000 records with complex cross-object logic that needs to maintain state across iterations — specifically, tracking running totals and error counts. Queueable Apex supports passing complex serializable objects as state, while Batch Apex requires implementing Database.Stateful, which serializes the job’s instance variables between each execute call, adding overhead. Queueable also supports chaining — each job processes a subset and enqueues the next, giving me fine-grained control over execution. The trade-off is that Queueable does not have the built-in QueryLocator that Batch provides, so I manage the pagination manually. For 5,000 records this is trivial. If the volume were 500,000+, I would switch to Batch Apex because its QueryLocator handles large datasets more efficiently and each execute invocation gets its own governor limit context per chunk.”
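The chained-job pattern described in that answer can be sketched as follows. This is a Python model purely for illustration — in Apex, each `run` would be a Queueable’s `execute()` and the self-enqueue would be `System.enqueueJob()`; the `ChainedJob` name, chunk size, and record shape are all invented.

```python
from dataclasses import dataclass

CHUNK_SIZE = 200  # records processed per chained job

@dataclass
class ChainedJob:
    records: list               # remaining records to process
    running_total: float = 0.0  # state carried from job to job
    error_count: int = 0

    def run(self):
        chunk, remaining = self.records[:CHUNK_SIZE], self.records[CHUNK_SIZE:]
        for rec in chunk:
            try:
                self.running_total += rec["amount"]
            except (KeyError, TypeError):
                self.error_count += 1
        if remaining:
            # Apex equivalent: System.enqueueJob(new ChainedJob(remaining, ...))
            next_job = ChainedJob(remaining, self.running_total, self.error_count)
            return next_job.run()
        return self.running_total, self.error_count

records = [{"amount": 10.0}] * 450 + [{"bad": True}] * 5
total, errors = ChainedJob(records).run()
print(total, errors)  # 4500.0 5
```

The point of the pattern is visible in the constructor call: the running total and error count travel with each chained job, which is exactly the state-passing that Batch Apex only gets via Database.Stateful.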

Type 5: Cascading — “If You Change X, What Happens to Y?”

These questions test cross-domain dependency awareness.

Q5.1: Org Strategy Change Cascading Impact

Judge: “You just acknowledged the EU data residency requirement might force multi-org. If you split into two orgs, what else in your architecture has to change?”

What they’re testing: Whether you can trace the full blast radius of a major architectural change.

Model answer: “Splitting to multi-org cascades through almost every layer. Data model: shared customer records now require a cross-org sync strategy — I would introduce a MuleSoft-mediated hub-and-spoke pattern with the corporate org as the hub and EU org as the spoke. Identity: two orgs need a shared IdP with separate SAML configurations per org, but the same corporate Active Directory as the identity source. Sharing model: each org designs its own sharing model independently, which simplifies each org but adds the integration user permission design. Licensing: two separate license pools, likely increasing total cost by 30-40%. Reporting: the CFO’s consolidated pipeline dashboard now requires either a data warehouse receiving data from both orgs or Tableau connecting to both orgs. Governance: two release cycles, two sandbox strategies, paired sandbox environments for integration testing. This is why I strongly prefer single-org — the cascade cost of multi-org is enormous.”


Q5.2: Edition Downgrade Impact

Judge: “The client pushes back on cost and wants to downgrade from Unlimited to Enterprise edition. What breaks in your architecture?”

What they’re testing: Whether you understand the cascading impact of edition differences.

Model answer: “Three areas are directly impacted. First, API calls drop from 5,000 per user to 1,000 per user per 24 hours. With 500 users and the integration architecture making approximately 800,000 API calls daily, we would need to purchase additional API call packs or redesign integrations to use Bulk API and Change Data Capture instead of individual REST calls. Second, data storage drops from 120 MB per user to 20 MB per user — that is a 50 GB reduction for 500 users, which impacts the data archival timeline and may require earlier migration to Big Objects or external storage. Third, developer sandboxes drop from 100 to 25, which constrains the parallel development environment strategy for the four-team delivery model. Premier Support also becomes an add-on cost instead of included. Shield remains an add-on either way, so that is unchanged.”
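The arithmetic behind those API and storage figures, as a quick sanity-check sketch using the numbers stated in the answer:

```python
# Quantifying the Enterprise-downgrade impact with the figures cited above.
USERS = 500
DAILY_CALLS_NEEDED = 800_000  # integration volume stated in the scenario

unlimited_api_cap = 5_000 * USERS    # 2,500,000 calls per 24 hours
enterprise_api_cap = 1_000 * USERS   # 500,000 calls per 24 hours
shortfall = DAILY_CALLS_NEEDED - enterprise_api_cap
print(f"API shortfall on Enterprise: {shortfall:,} calls/day")  # 300,000

storage_delta_mb = (120 - 20) * USERS
print(f"Data storage lost: {storage_delta_mb / 1000:.0f} GB")   # 50 GB
```

The 300,000-call daily shortfall is what forces either additional API call packs or the Bulk API/Change Data Capture redesign mentioned above.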


Q5.3: Mobile Strategy Change Impact on Integration

Judge: “You just switched from Salesforce Mobile App to Mobile SDK for the field team. How does that change your integration architecture?”

What they’re testing: Whether you understand that mobile architecture changes ripple into integration design.

Model answer: “Moving to Mobile SDK changes the integration layer in three ways. First, authentication shifts from the standard Salesforce Mobile App’s managed OAuth flow to a custom OAuth implementation using the Mobile SDK’s authentication framework — I need a Connected App configured for the mobile client with appropriate scopes and a refresh token strategy for offline scenarios. Second, data synchronization changes from Briefcase’s platform-managed sync to MobileSync framework, which makes direct REST API calls to Salesforce. This increases the API call volume because each device syncs independently, so I need to factor 200 additional API consumers into the 24-hour API limit calculation. Third, error handling becomes my responsibility — the standard mobile app handles network failures gracefully, but with Mobile SDK I need to implement retry logic, conflict resolution for offline edits, and a queue management pattern in SmartStore. The MuleSoft integration layer is not directly impacted since the mobile app calls Salesforce APIs, not the middleware directly.”
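To fold those 200 independent sync clients into the 24-hour API limit calculation, a rough sizing sketch might look like this. The sync frequency and calls-per-sync figures are assumptions for illustration, not measured values — they would come from profiling the MobileSync configuration.

```python
# Rough sizing of the extra API load from Mobile SDK devices. The per-device
# figures below are assumptions to replace with real profiling numbers.
DEVICES = 200
SYNCS_PER_DAY = 12   # assumed: a technician syncs roughly hourly on shift
CALLS_PER_SYNC = 8   # assumed: one query/upsert per synced object type

extra_daily_calls = DEVICES * SYNCS_PER_DAY * CALLS_PER_SYNC
print(f"Additional API calls/day: {extra_daily_calls:,}")  # 19,200
```

Even modest per-device figures multiply quickly across a fleet, which is why the 200 devices need to appear explicitly in the org-wide API budget.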


Q5.4: Reporting Strategy Change Impact

Judge: “If you switch from CRM Analytics to standard reports for the sales team dashboards to save on licensing, what impact does that have on the data model?”

What they’re testing: Whether you recognize that analytics tool changes can cascade into data model decisions.

Model answer: “Standard reports have a four-object join limit in custom report types, which means the five-object join in my CRM Analytics dataflow for the pipeline-by-territory-by-product dashboard cannot be replicated directly. I would need to denormalize some data — either adding formula fields or roll-up summaries to reduce the join count, or creating a summary custom object that a scheduled Flow populates nightly with pre-aggregated data. This denormalization adds storage overhead and introduces a data freshness lag that CRM Analytics does not have. The historical trending requirement also changes — CRM Analytics dataflows captured weekly snapshots; without it, I would need Reporting Snapshots writing to a custom object, which creates records that consume data storage. The trade-off is lower license cost but higher data model complexity and reduced analytical flexibility.”

This is a personal study site for Salesforce CTA exam preparation. Built with AI assistance. Not affiliated with Salesforce.