DevOps Quick Reference: Environments, CI/CD, Testing & Governance
All-in-one DevOps reference for CTA scenarios. Environments, CI/CD pipelines, testing strategy, and governance — combined for rapid review. Tables over prose — this is your pre-board cheat sheet.
Sandbox Comparison Matrix
| | Developer | Developer Pro | Partial Copy | Full Copy | Scratch Org |
|---|---|---|---|---|---|
| Metadata | Full | Full | Full | Full | Source-pushed |
| Data | None | None | Template sample | Full production | Scripted |
| Storage | 200 MB | 1 GB | 5 GB | Same as prod | 200 MB |
| Refresh | 1 day | 1 day | 5 days | 29 days | Recreate (ephemeral) |
| Max lifetime | Persistent | Persistent | Persistent | Persistent | 30 days |
| Performance testing | No | No | No | Yes | No |
| Production data | No | No | Sampled | Yes (mask it!) | No |
| CI/CD use | Manual dev | Manual dev | Integration test | Staging/perf | Pipeline validation |
Environment-to-Test Mapping
| Environment | Tests That Run There | Data Needed |
|---|---|---|
| Scratch Org | Apex unit tests, LWC Jest, static analysis (PMD/ESLint) | Test data factory scripts |
| SIT (Partial Copy) | Integration tests, cross-team validation, E2E smoke | Sandbox template data |
| UAT (Partial Copy) | Business user acceptance tests | Representative (masked) data |
| Staging (Full Copy) | Performance tests, final regression, smoke tests | Production-scale (masked) data |
| Production | Post-deployment smoke tests only | Real data |
CI/CD Pipeline — Step by Step
```mermaid
flowchart TD
A["Developer commits\nto feature branch"] --> B["Pull Request created"]
B --> C["CI Pipeline triggers"]
C --> D["Static Analysis\n(PMD + ESLint)"]
D --> E{Pass?}
E -->|No| F["Block merge\nNotify developer"]
E -->|Yes| G["Create scratch org"]
G --> H["Deploy source"]
H --> I["Run Apex tests\n(85%+ coverage)"]
I --> J["Run LWC Jest tests"]
J --> K{All pass?}
K -->|No| F
K -->|Yes| L["Code review\napproved"]
L --> M["Merge to develop"]
M --> N["Deploy to SIT"]
N --> O["Integration tests"]
O --> P["Deploy to UAT"]
P --> Q["UAT sign-off"]
Q --> R["Deploy to Staging"]
R --> S["Smoke + perf tests"]
S --> T["Deploy to Production"]
style F fill:#e76f51,stroke:#c45a3f,color:#fff
style T fill:#2d6a4f,stroke:#1b4332,color:#fff
style K fill:#f4a261,stroke:#d4823e,color:#000
```
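The merge-gate decisions in the pipeline above can be condensed into a single pass/block function. This is an illustrative Python sketch: the argument shapes (`static_analysis_passed`, `apex_results`, `jest_passed`) are assumptions, not the output format of any specific CI tool.

```python
def merge_gate(static_analysis_passed: bool,
               apex_results: dict,
               jest_passed: bool,
               min_coverage: float = 85.0) -> str:
    """Return 'merge' when every gate passes, else 'block'."""
    if not static_analysis_passed:
        return "block"                     # PMD/ESLint failures stop the PR
    if apex_results["failures"] > 0:
        return "block"                     # any failing Apex test blocks merge
    if apex_results["coverage"] < min_coverage:
        return "block"                     # enforce 85%+, not the platform's 75%
    if not jest_passed:
        return "block"                     # LWC Jest failures block too
    return "merge"
```

The point the board cares about: every gate is binary and automated, so no human judgment is needed before code review.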
CI/CD Tool Comparison
| Tool | Type | Strength | Cost | CTA When to Recommend |
|---|---|---|---|---|
| GitHub Actions | General CI/CD | Flexible, free for public repos | Free/Paid | Teams already on GitHub |
| Copado | SF-native | Built for SF, compliance, user stories | Expensive | Regulated industries, admin-friendly DevOps |
| Gearset | SF DevOps | Best diff/merge/compare tooling | Moderate | Teams needing metadata comparison |
| AutoRABIT | SF DevOps | Full suite with backup/restore | Expensive | Enterprise with backup requirements |
| Azure DevOps | General CI/CD | Enterprise integration, boards | Paid | Microsoft-centric enterprises |
| GitLab CI | General CI/CD | Integrated repo + CI, self-hosted | Free/Paid | Self-hosted requirement |
CTA Tool Selection Logic
Don’t recommend a tool — recommend criteria. “We need CI/CD that supports scratch org creation, automated Apex testing, and integrates with our existing Git provider. Given the team uses GitHub, GitHub Actions with the sf CLI is the most cost-effective choice.” The board cares about your reasoning, not brand loyalty.
Deployment Mechanisms Compared
| Mechanism | Rollback | Version History | CI/CD | Dependency Mgmt | CTA Verdict |
|---|---|---|---|---|---|
| Change Sets | None | None | No | None | Legacy — migrate away |
| Salesforce CLI | Via source control | Git history | Yes | Manual | Enterprise standard |
| Unlocked Packages | Install previous version | Package versions | Yes | Declared | Best for modular orgs |
| Managed Packages | Install previous version | Package versions | Yes | Declared | ISV only |
| DevOps Center | Limited | Basic tracking | Partial | None | Transitional tool |
Unlocked Package Architecture
```mermaid
flowchart TD
CORE["Core Package\n(Objects, Fields,\nPermission Sets)"] --> SALES["Sales Package\n(Flows, LWC, Apex)"]
CORE --> SERVICE["Service Package\n(Flows, LWC, Apex)"]
CORE --> INTEG["Integration Package\n(Named Creds,\nExt Services, Apex)"]
style CORE fill:#1a535c,stroke:#0d3b44,color:#fff
style SALES fill:#2d6a4f,stroke:#1b4332,color:#fff
style SERVICE fill:#4ecdc4,stroke:#3ab5ad,color:#000
style INTEG fill:#f4a261,stroke:#d4823e,color:#000
```
Feature Flags — Implementation Options
| Mechanism | Toggle Speed | Scope | Deployment? | Best For |
|---|---|---|---|---|
| Custom Metadata Type | Deploy required | Org-wide | Yes (metadata) | Feature gates across environments |
| Custom Settings | Instant (DML) | Per user/profile | No (data) | Quick kill switches, user-level control |
| Custom Permissions | Permission Set assign | Per user | Yes (metadata) | User-level feature access |
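The precedence between these three mechanisms can be sketched as one evaluation function. This is a hypothetical Python illustration: `metadata_flags`, `kill_switches`, and `user_permissions` stand in for Custom Metadata records, Custom Settings, and `FeatureManagement.checkPermission()` results; in an org this logic would live in Apex.

```python
def feature_enabled(flag: str,
                    metadata_flags: dict,
                    kill_switches: dict,
                    user_permissions: set,
                    required_permission: str = None) -> bool:
    """Evaluate a feature flag with kill-switch precedence."""
    if kill_switches.get(flag):          # Custom Settings analogue: instant
        return False                     # DML toggle, no deployment needed
    if not metadata_flags.get(flag):     # Custom Metadata analogue: org-wide
        return False                     # gate, deployed with the release
    if required_permission and required_permission not in user_permissions:
        return False                     # Custom Permission analogue: per-user
    return True
```

Design note: the kill switch is checked first so operations can disable a feature instantly even while the metadata gate still says "on".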
Testing Quick Reference
Test Data Strategies
| Strategy | Description | When |
|---|---|---|
| Test Data Factory | Centralized `@isTest` utility class | Always — default approach |
| `@TestSetup` | Creates data once for all test methods | Multiple methods need same base data |
| Static Resources | CSV loaded as test data | Bulk tests with specific patterns |
| `SeeAllData=true` | Tests see real org data | Almost never — only for specific platform features |
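The factory pattern in the first row can be sketched generically. In Apex this is an `@isTest` utility class; the Python `make_account` helper below is a hypothetical illustration of the same idea: sensible defaults, explicit overrides, and a bulk variant sized for trigger bulkification tests.

```python
def make_account(**overrides) -> dict:
    """Factory with defaults; tests override only the fields they assert on."""
    record = {
        "Name": "Test Account",
        "Industry": "Technology",
        "BillingCountry": "US",
    }
    record.update(overrides)
    return record

def make_accounts(count: int, **overrides) -> list:
    """Bulk variant — 200 records exercises trigger bulkification."""
    records = [make_account(**overrides) for _ in range(count)]
    for i, rec in enumerate(records):
        rec["Name"] = f'{rec["Name"]} {i}'   # keep names unique per record
    return records
```

Because defaults live in one place, a schema change (say, a new required field) is fixed in the factory once, not in every test class.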
Apex Testing Checklist
- Test data factory pattern (no hardcoded data in tests)
- Positive tests (happy path)
- Negative tests (error handling, invalid data)
- Bulk tests (200 records — trigger bulkification)
- Boundary tests (0 records, null values, max values)
- `Test.startTest()` / `Test.stopTest()` for governor limit reset
- `HttpCalloutMock` for all external callouts
- Meaningful assertions (not just "it didn't crash")
- 85%+ coverage target (not 75%)
Test Automation Tools
| Tool | Layer | Automated? | CTA Notes |
|---|---|---|---|
| Apex Testing Framework | Unit + Integration | Fully | Foundation — always include |
| Jest (`@salesforce/sfdx-lwc-jest`) | Component | Fully | Required for any LWC |
| PMD | Static analysis (Apex) | Fully | Free, catches common issues |
| ESLint | Static analysis (JS) | Fully | Standard for LWC JavaScript |
| Provar | E2E (SF-native) | Automated | Understands SF DOM |
| Copado Robotic Testing | E2E (SF-native) | Automated | No-code test creation |
Governance Cheat Sheet
RACI — Simplified CTA Version
| Activity | CTA/EA | Dev Lead | Admin Lead | PM |
|---|---|---|---|---|
| Architecture decisions | A, R | C | C | I |
| Code standards | A | R | I | I |
| Declarative standards | A | C | R | I |
| Production deployments | C | R | R | A |
| Vendor evaluation | A, R | C | C | C |
| Release planning | C | R | R | A |
R = Responsible, A = Accountable, C = Consulted, I = Informed
Change Classification
| Type | Risk | Approval | Example |
|---|---|---|---|
| Standard | Low | Pre-approved | Field label, report |
| Normal | Medium | CAB review | New automation, integration |
| Emergency | High | Expedited + post-review | Production bug, security patch |
| Major | High | Full CAB + ARB | New cloud, data migration |
Compliance — What to Mention at the Board
| Framework | Key SF Requirement | CTA Must-Say |
|---|---|---|
| GDPR | Right to be forgotten, consent management | "Data deletion processes, anonymization scripts" |
| HIPAA | PHI encryption, audit logging | "Shield Platform Encryption, Event Monitoring, data masking in sandboxes" |
| SOX | Audit trails, segregation of duties | "Field Audit Trail, approval process segregation" |
| PCI-DSS | Cardholder data protection | "Never store card data in Salesforce — use a payment gateway integration" |
Compliance Overrides Everything
In regulated scenarios, compliance requirements override all architectural preferences. Do not recommend a technically superior solution that violates compliance. Document compliance as architectural constraints.
Reverse-Engineered Use Cases
Scenario 1: Insurance Company — Legacy Modernization
Situation: Insurance company has 15 developers, mix of change sets and manual deployments, no CI/CD, 3 production deployments per year. New CTO wants DevOps modernization.
What you’d do: Phased approach.
- Phase 1 (Month 1-2): Introduce Git (GitHub), establish branching strategy (modified GitFlow), move all metadata to source control. Keep change sets temporarily for admin changes.
- Phase 2 (Month 3-4): Set up GitHub Actions pipeline — scratch org validation on every PR. Automated Apex tests + PMD static analysis as merge gates.
- Phase 3 (Month 5-6): Migrate from change sets to CLI deployments. Introduce unlocked packages (Core, Sales, Claims, Integration). Increase release cadence to monthly.
Governance: Establish ARB (monthly reviews). CAB for all production deployments. Hybrid CoE with central architecture team and BU delivery squads.
Scenario 2: Retail — High-Volume Seasonal Deployments
Situation: Retailer with Black Friday traffic spikes. Needs zero-downtime deployments, feature flags for seasonal promotions, and performance testing with 10M product records.
What you’d do: Feature flags via Custom Metadata Types for all promotional features — deploy code weeks early, activate flags on launch day. Full Copy sandbox for performance testing with production-scale data (10M records). Perf test every SOQL query and batch job.
Deployment: Phased by user group (pilot store first, then regional rollout). Kill switch via Custom Settings for instant disable if issues detected.
Testing: Automated regression suite runs on every PR. Load testing on Full Copy sandbox simulating Black Friday API volume. UAT with business team 2 weeks before promotion launch.
Scenario 3: Government Agency — FedRAMP Compliance
Situation: Federal agency implementing Service Cloud. FedRAMP compliance requires formal change management, audit trails, and government cloud hosting.
What you’d do: Waterfall-hybrid methodology (formal milestones with agile sprints within each phase) to satisfy audit requirements. Copado for CI/CD — its compliance features generate deployment artifacts needed for audits. Shield Event Monitoring for all admin actions.
Environment strategy: Government Cloud with dedicated infrastructure. Full Copy sandbox for staging with complete data masking. No production data in any other environment. Post-copy scripts disable all external integrations and mask all PII.
Governance: Formal CAB with documented approval for every change. ARB quarterly reviews. All architectural decisions recorded as ADRs.
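The masking step in the environment strategy above can be sketched as follows. In a real org this logic lives in an Apex class implementing the `SandboxPostCopy` interface and runs automatically after refresh; the Python functions and masking rules below are illustrative assumptions, not a compliance-approved scheme.

```python
import hashlib

def mask_email(email: str) -> str:
    # Deterministic and non-reversible; .invalid TLD can never route mail.
    digest = hashlib.sha256(email.lower().encode()).hexdigest()[:10]
    return f"user_{digest}@example.invalid"

def mask_phone(_phone: str) -> str:
    # Drop the original value entirely rather than transforming it.
    return "000-000-0000"

def mask_contact(contact: dict) -> dict:
    """Return a copy of the record with PII fields replaced."""
    masked = dict(contact)
    if masked.get("Email"):
        masked["Email"] = mask_email(masked["Email"])
    if masked.get("Phone"):
        masked["Phone"] = mask_phone(masked["Phone"])
    return masked
```

Deterministic hashing (rather than random values) keeps masked records internally consistent: the same source email always maps to the same masked email, so duplicate-detection and matching logic still behaves realistically in the sandbox.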
DevOps Maturity Quick Assessment
| Level | Characteristics | CTA Recommendation |
|---|---|---|
| Level 1: Ad Hoc | Change sets, manual testing, no source control | Introduce Git + CLI, basic CI |
| Level 2: Managed | Source control, some CI, manual deploys | Automate pipeline, add scratch orgs |
| Level 3: Defined | CI/CD pipeline, automated testing, sandboxes | Add unlocked packages, performance testing |
| Level 4: Measured | Metrics-driven, feature flags, fast rollback | Optimize release cadence, continuous deployment |
| Level 5: Optimized | Full automation, canary releases, self-healing | Maintain and innovate |
Key Anti-Patterns
| Anti-Pattern | Fix |
|---|---|
| Deploying to prod on Fridays | Tuesday/Thursday deployment windows (off-peak) |
| No rollback plan | Document rollback for every deployment |
| Full Copy sandbox with unmasked PII | SandboxPostCopy Apex for mandatory masking |
| One developer with all the knowledge | Cross-training, documented runbooks, RACI |
| Skipping Staging environment | Staging is mandatory for performance validation |
| No destructive change review | Destructive changes need code review + CAB approval |
| ”75% coverage is enough” | 85%+ with meaningful assertions |
Deep Dive Links
- Environment Strategy — Full sandbox analysis and topology patterns
- CI/CD & Deployment — Pipeline architecture, branching, unlocked packages
- Testing Strategy — Test pyramid, automation tools, UAT planning
- Governance Model — ARB, CAB, CoE models, ADRs, compliance
- Risk Management — Risk register, probability-impact matrix
- Trade-Offs — Agile vs Waterfall, change sets vs CLI, and more
- D6 Quick Reference Overview — Back to overview