
SOC 2 Readiness Assessment: Self-Assessment Quiz for Indian SaaS

Five-minute SOC 2 readiness self-assessment for Indian SaaS and BFSI teams — gauge your gap count, timeline, and budget before engaging a Bangalore auditor.

API4SOC2 Editorial · 28 June 2026 · 13 min read

A SOC 2 readiness assessment should be the first step before committing ₹6–₹22 lakh and 12–16 weeks to a SOC 2 Type II audit, because knowing your gap count upfront determines realistic budget and timeline. This self-assessment quiz covers the twelve control domains that auditors evaluate most closely. Score yourself, tally the gaps, and use the result to scope a realistic readiness programme.

The article explains each domain, what “ready” looks like, and how gap count translates to timeline and cost. The interactive companion is available at the end of the post.

The twelve SOC 2 readiness domains

Domain 1: Security policy and governance

Ready: A board-approved information security policy exists, is reviewed annually, and is communicated to all employees.

Gap: No documented policy, or policy is outdated and not acknowledged.

Domain 2: Access control

Ready: Role-based access control is implemented, privilege reviews happen quarterly, and MFA is enforced on all production access.

Gap: Shared credentials exist, access reviews are ad-hoc, or MFA is not enforced.

Domain 3: Change management

Ready: All production changes are approved, tested, and documented in a ticketing system with rollback plans.

Gap: Hotfixes bypass process, or change records are incomplete.

Domain 4: System operations and monitoring

Ready: Logs are centralised, alerting is configured for critical events, and incident response playbooks exist.

Gap: No SIEM, no alerting, or logs are not retained for the observation period.

Domain 5: Risk assessment

Ready: A formal risk assessment is conducted annually, with documented mitigations and ownership.

Gap: Risk assessment is informal or not documented.

Domain 6: Vendor management

Ready: Critical vendors are assessed for security posture, and DPAs are in place for vendors processing customer data.

Gap: No vendor security reviews, or contracts lack security clauses.

Domain 7: Data classification and handling

Ready: Data is classified by sensitivity, and handling rules are enforced by technical controls.

Gap: No classification scheme, or controls do not match classification.

Domain 8: Encryption

Ready: Data is encrypted at rest and in transit using industry-standard algorithms and key management.

Gap: Encryption is missing for some data classes, or key management is manual.

Domain 9: Backup and recovery

Ready: Backups are automated, tested quarterly, and stored in a geographically separate location.

Gap: Backups are manual, untested, or co-located with primary data.

Domain 10: HR security

Ready: Background checks are conducted, security awareness training is annual, and termination access revocation is within 24 hours.

Gap: No background checks, no training, or access revocation is delayed.
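The 24-hour revocation SLA above lends itself to a simple self-check. A minimal sketch, assuming your HRIS and IdP can export termination and revocation timestamps — the record structure and names below are invented for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical leaver records: (employee, terminated_at, access_revoked_at)
leavers = [
    ("a.sharma", datetime(2026, 3, 1, 10, 0), datetime(2026, 3, 1, 18, 30)),
    ("r.iyer",   datetime(2026, 3, 4,  9, 0), datetime(2026, 3, 6, 11, 0)),
]

def revocation_exceptions(records, sla=timedelta(hours=24)):
    """Return leavers whose access revocation exceeded the 24-hour SLA."""
    return [emp for emp, terminated, revoked in records
            if revoked - terminated > sla]

print(revocation_exceptions(leavers))  # ['r.iyer']
```

Running this monthly against your joiner-mover-leaver records turns a policy statement into checkable evidence.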

Domain 11: Physical security

Ready: Offices and data centres have controlled access, visitor logs, and environmental monitoring.

Gap: No visitor management, or cloud-only with no physical controls documented.

Domain 12: Incident response

Ready: An incident-response plan is documented, tested annually, and includes notification timelines.

Gap: No plan, or plan has not been tested.

Scoring and gap interpretation

Gap count | Readiness level | Typical readiness timeline | Estimated readiness fee (INR)
0–3 | High | 4–6 weeks | ₹1,50,000 – ₹2,50,000
4–7 | Medium | 8–12 weeks | ₹3,00,000 – ₹6,00,000
8–12 | Low | 16–24 weeks | ₹6,00,000 – ₹12,00,000
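The banding in the table above can be sketched as a small lookup function. The boundaries and fee ranges are copied straight from the table; this is an illustration for self-scoring, not a quoting tool:

```python
def readiness_band(gap_count: int) -> dict:
    """Map a 0-12 gap count to the readiness band from the scoring table."""
    if not 0 <= gap_count <= 12:
        raise ValueError("gap count must be between 0 and 12")
    if gap_count <= 3:
        return {"level": "High", "timeline_weeks": (4, 6),
                "fee_inr": (150_000, 250_000)}
    if gap_count <= 7:
        return {"level": "Medium", "timeline_weeks": (8, 12),
                "fee_inr": (300_000, 600_000)}
    return {"level": "Low", "timeline_weeks": (16, 24),
            "fee_inr": (600_000, 1_200_000)}

print(readiness_band(5)["level"])  # Medium
```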

From quiz to audit: what happens next

  1. Gap closure — Address the gaps identified in the quiz. We provide a prioritised roadmap.
  2. Evidence collection — Gather screenshots, policies, logs, and configuration records for the observation period.
  3. Type I audit (optional) — A design-effectiveness opinion that accelerates Type II readiness.
  4. Type II audit — Operating-effectiveness opinion over 6–12 months.
  5. Report issuance — Clean opinion, qualified opinion, or adverse opinion.

Cross-framework mapping

  • ISO 27001:2022 — The twelve domains map to Annex A controls. A combined readiness assessment is available. See our ISO 27001 service page.
  • DPDP Act 2023 — Access control, encryption, and incident response overlap with DPDP obligations. See our DPDP service page.

Industry-specific readiness considerations

The twelve-domain framework applies broadly, but some domains carry more weight in certain industries. Below is the calibration we use during readiness assessments for the verticals we serve most often.

B2B SaaS — focus on access, encryption, and customer-data isolation

For Bangalore SaaS exporters, the highest-leverage domains are access control (Domain 2), encryption (Domain 8), and data classification (Domain 7). The customer-data-isolation question — whether tenant boundaries are enforced technically or only logically — frequently surfaces gaps in multi-tenant architectures. SaaS readiness assessments typically identify 4–8 gaps even for well-engineered platforms.

BFSI — focus on change management and incident response

BFSI organisations typically have stronger access control and encryption baselines but weaker change-management discipline relative to the SOC 2 standard. Domain 3 (change management) and Domain 12 (incident response) are the most-common gap areas, particularly around documented evidence of change approval and tested incident playbooks. BFSI readiness assessments typically identify 3–6 gaps.

HealthTech — focus on data classification and HR security

HealthTech platforms handling PHI typically have strong encryption and access controls but underdeveloped data-classification frameworks (Domain 7) and inconsistent HR-security practices around background checks and access termination (Domain 10). HealthTech readiness assessments typically identify 5–9 gaps.

EdTech — focus on data minimisation and HR security

EdTech with children’s data exposure has unique gaps in data minimisation (Domain 7), retention discipline (also Domain 7), and access controls for support staff (Domain 2). The DPDP children’s-data overlay adds gaps not strictly within the SOC 2 framework but operationally relevant. EdTech readiness assessments typically identify 6–10 gaps.

Crypto exchanges — focus on system operations and incident response

Indian crypto exchanges typically have advanced cryptography and access controls but underdeveloped operational monitoring (Domain 4) relative to the SOC 2 standard, particularly around continuous control validation and 24×7 SOC capability. Domain 12 (incident response) is consistently a gap area given the threat-actor sophistication targeting the sector. Crypto exchange readiness assessments typically identify 4–7 gaps.

From quiz to roadmap — how to convert gap counts into a programme

A gap count alone is not a programme. The conversion from gap count to actionable roadmap involves three additional decisions.

Decision 1: Trust Services Criteria scope

Security TSC is mandatory for any SOC 2 engagement. Other TSCs are optional and chosen based on buyer expectations:

  • Availability — for platforms where uptime is part of the customer commitment. Most B2B SaaS includes this.
  • Confidentiality — for platforms handling customer-confidential data beyond the public privacy notice. Most B2B SaaS does not include this; it is more relevant for legal-tech and enterprise data platforms.
  • Processing Integrity — for platforms where transaction integrity matters. Most fintech platforms include this.
  • Privacy — for platforms with consumer-direct data relationships. EdTech and consumer apps typically include this.

Each additional TSC adds approximately 15–25% to the engagement cost. The decision should follow buyer demand, not internal preference.
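A rough way to reason about TSC scope cost, assuming the 15–25% per-TSC uplift above applies to your base engagement fee (the ₹8 lakh base in the usage example is hypothetical):

```python
def engagement_estimate(base_cost_inr: float, extra_tscs: int,
                        uplift_low: float = 0.15, uplift_high: float = 0.25):
    """Rough cost range: each optional TSC adds ~15-25% of the base fee."""
    low = base_cost_inr * (1 + uplift_low * extra_tscs)
    high = base_cost_inr * (1 + uplift_high * extra_tscs)
    return low, high

# Security + Availability + Processing Integrity (two optional TSCs)
# on a hypothetical 8 lakh base: roughly 10.4-12.0 lakh
print(engagement_estimate(800_000, 2))
```

The calculation also makes the trade-off concrete: adding a TSC that no buyer has asked for is pure cost.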

Decision 2: Observation period length

The minimum observation period for SOC 2 Type II is 6 months; the maximum is 12 months. Buyer expectations vary:

  • 6-month observation — accepted by most US enterprise buyers; appropriate for first-time attestations.
  • 9-month observation — increasingly requested by Fortune 500 buyers; demonstrates control consistency over a longer window.
  • 12-month observation — required by some highly-regulated buyers (US Federal, large healthcare); demonstrates full annual cycle coverage.

A longer observation window means more evidence to collect, but also a stronger buyer signal.

Decision 3: Tooling versus internal evidence

Most Bangalore engagements use a GRC tool (Vanta, Drata, Sprinto) to automate evidence collection. The annual tool cost is ₹1,00,000–₹5,00,000. The trade-off is that tooling automates roughly 60% of evidence collection, leaving 40% as manual workflows. Organisations without tooling typically need an additional 1–2 internal-team-weeks per audit cycle.
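The tooling decision reduces to a break-even comparison. A sketch under stated assumptions — the weekly team cost is a placeholder to replace with your own payroll figure, and the extra-weeks input comes from the 1–2 team-weeks estimate above:

```python
def tooling_breakeven(tool_cost_inr: float, audits_per_year: int,
                      extra_weeks_per_audit: float,
                      weekly_team_cost_inr: float) -> dict:
    """Compare annual GRC-tool cost with the manual evidence effort it
    replaces. weekly_team_cost_inr is a hypothetical blended cost of one
    internal team-week; tune it to your own numbers."""
    manual_cost = audits_per_year * extra_weeks_per_audit * weekly_team_cost_inr
    return {"tool_inr": tool_cost_inr, "manual_inr": manual_cost,
            "tool_cheaper": tool_cost_inr < manual_cost}

# Hypothetical: one audit cycle, 2 extra team-weeks at 1 lakh/week,
# against a 3 lakh annual tool licence
print(tooling_breakeven(300_000, 1, 2, 100_000))
```

Note that the cash comparison understates the tooling case: automated collection also reduces fieldwork friction, which this sketch does not price in.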

Common readiness-assessment failures

Beyond the domain-specific gaps, three meta-failures recur in Bangalore readiness assessments.

  • Treating the assessment as a one-time exercise. Readiness should be re-evaluated semi-annually; control posture drifts continuously.
  • Skipping the documentation review. Strong technical controls without documented policies still fail the audit; the auditor evaluates documented controls, not undocumented ones.
  • Ignoring third-party risk. The Vendor Management domain (Domain 6) is consistently under-scored because most organisations have not formally assessed their critical processors against the SOC 2 standard.

Practical next steps

If you scored 8+ gaps, start with a formal readiness assessment before scoping the audit. For Indian organisations that want hands-on help, our SOC 2 compliance service includes readiness, evidence collection, and audit ownership. See also our ISO 27001 certification page if you are running a combined programme. If you scored 0–3 gaps, you may be ready to begin the observation period immediately. If you want the interactive version of this quiz with auto-scoring and a downloadable gap report, see the companion tool linked below.

For organisations that want a thirty-minute scoping conversation with a partner, the contact form in the site footer books the call directly. We commit to written scope, fixed price in INR, and direct partner-level accountability through the engagement.

SOC 2 readiness FAQ

How accurate is the self-assessment relative to a paid readiness assessment? Reasonably accurate for the gap-count-band determination. A paid readiness assessment adds depth — control-by-control review, evidence-collection guidance, and a specific remediation roadmap. Use the self-assessment to scope; use the paid assessment to execute.

Can I retake the quiz after closing gaps? Yes. We recommend retaking the quiz quarterly through the readiness phase to track progress.

Does scoring 0 gaps mean I’m SOC 2 certified? No. Scoring 0 gaps means you’re ready to begin the observation period. The Type II report comes after the observation window plus fieldwork.

What if I can’t honestly score myself? Common gaps in self-assessments: under-scoring access controls (they look fine until you check evidence retention); over-scoring incident response (everyone says they have a plan, few have tested it). When in doubt, score conservatively.

Can I use this quiz for ISO 27001 readiness too? With modifications. The twelve domains overlap substantially with ISO 27001 Annex A, but ISO has additional ISMS-specific requirements (clauses 4–10) that this quiz doesn’t cover.

How does the quiz score map to engagement timeline? 0–3 gaps: 4–6 weeks readiness, 12 weeks total to Type II. 4–7 gaps: 8–12 weeks readiness, 16 weeks total. 8–12 gaps: 16–24 weeks readiness, 28 weeks total.

Should I take the quiz before or after engaging an auditor? Before. The quiz informs your auditor selection and pricing negotiation. Walk into the auditor conversation with a sense of your gap count.

What if the quiz reveals I’m not ready? That’s the point. Use the result to scope a readiness programme before committing to the audit. Better to delay 8 weeks for readiness than to fail the audit.

Does the quiz cover all 5 Trust Services Criteria? Primarily Security TSC (which is mandatory). Availability and Confidentiality are partially covered through Domains 4, 8, and 9. Processing Integrity and Privacy require additional assessment beyond this quiz.

Can a board approve SOC 2 budget based on the quiz alone? For directional budgeting, yes. For final commitment, a paid readiness assessment provides the precision needed for budget locking.

How do I share the quiz results with stakeholders? The downloadable PDF version includes a one-page summary slide suitable for board or executive briefing.

What if my gap count changes between starting and finishing the engagement? Normal. Re-evaluate quarterly. New product features, hires, or vendor changes create new gaps; remediation closes existing ones. The quiz is a snapshot, not a static measure.

Anatomy of a clean Type II report

For Bangalore SaaS founders evaluating SOC 2 outcomes, knowing what a clean Type II report looks like helps calibrate expectations. A clean opinion includes:

  • the CPA firm’s opinion paragraph (no qualifications, no exceptions);
  • management’s assertion about the system;
  • the system description (typically 30–50 pages covering the control environment, processes, and infrastructure);
  • the trust services criteria with control activities (typically 80–150 controls depending on TSC scope);
  • the auditor’s tests of operating effectiveness (typically 20–40 control tests with results).

Common findings that produce qualifications rather than clean opinions: missing evidence for one or more controls during the observation period; control failures discovered during testing without evidence of timely remediation; documentation gaps that auditors cannot evaluate; control design weaknesses identified during fieldwork. Most first-time Bangalore SaaS engagements that fail to produce clean opinions do so on documentation rather than control failure — the controls operate, but the evidence of operation is incomplete.

Pre-audit readiness boost — final-quarter activities

For organisations entering the final quarter before SOC 2 fieldwork, specific pre-audit activities materially improve outcomes.

Evidence dry-run. Pull a sample of evidence for each control as if the auditor were testing today. Identify gaps; remediate before fieldwork. Most teams discover surprising gaps through this exercise.

Management interview prep. Auditors interview specific control owners. Prep the interview list, prepare control owners, and run mock interviews. Unprepared control owners produce inconsistent answers that auditors must investigate further.

Documentation pass-through. Review every policy and procedure for currency. Stale documentation (not reviewed in 12+ months) is a common finding even when controls operate correctly.

Vendor evidence collection. SOC 2 expects evidence of vendor management. Collect SOC 2 reports from critical vendors before audit; auditors will ask. Vendors who don’t produce SOC 2 reports require alternative evidence (contractual provisions, security questionnaires).

Sample-size planning. Auditors sample evidence; the sample sizes for each control are defined in the AICPA standard. Knowing the expected sample size lets you pre-collect that evidence rather than scrambling during fieldwork.

Beyond the quiz — building genuine SOC 2 maturity

The quiz scores readiness; maturity is the longer-term destination. Specific practices distinguish nominally ready organisations from genuinely mature ones.

Quarterly self-assessment. Re-running the readiness quiz quarterly catches drift before it becomes a finding. Mature organisations institutionalise this cadence.

Control-owner accountability. Each SOC 2 control has a named owner accountable for its operation. Owners review their controls quarterly; ownership reassignment happens at role transitions. Without ownership, controls drift.

Evidence automation. Mature organisations automate evidence collection — log shipping to dedicated archives, periodic snapshots of access lists, documented change tickets. Without automation, every audit cycle requires fresh evidence collection effort.
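One pattern behind evidence automation is the timestamped, integrity-stamped snapshot: capture the current state of a control artefact (an access list, a config export) into an archive the auditor can later sample. A minimal sketch — the structure of the `users` export is illustrative, not a real IdP API:

```python
import json
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def snapshot_access_list(users, archive_dir="evidence/access-reviews"):
    """Write a timestamped, hash-stamped snapshot of the current access list.

    `users` is whatever your IdP export produces; the shape used here is
    hypothetical. The SHA-256 prefix in the filename lets you show the
    archived evidence has not been altered since capture."""
    Path(archive_dir).mkdir(parents=True, exist_ok=True)
    payload = json.dumps(users, indent=2, sort_keys=True)
    digest = hashlib.sha256(payload.encode()).hexdigest()[:12]
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    path = Path(archive_dir) / f"access-{stamp}-{digest}.json"
    path.write_text(payload)
    return path
```

Scheduled weekly, a script like this turns quarterly access reviews from an archaeology exercise into a diff of two snapshots.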

Continuous threat monitoring. Beyond compliance evidence, mature organisations monitor for actual security threats — anomaly detection, threat-intelligence integration, behavioural analytics. The CC7.x criteria reward this maturity.

Vendor lifecycle management. Mature organisations track vendor security posture continuously, not just at onboarding. Annual vendor security reviews, monitoring of vendor security incidents, structured offboarding when vendors are sunset.

Incident-response practice. Mature organisations practise incident response through structured tabletops, real-world simulations, and post-incident reviews. The practiced muscle is dramatically more effective than the documented playbook.

Metric reporting. Mature organisations report security metrics to executive leadership with consistent cadence. Metrics that matter: time-to-patch for critical vulnerabilities, percentage of accounts with MFA enforced, time-to-revoke access on departure, frequency of privileged-access reviews.
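One of these metrics — MFA coverage — reduces to a one-line calculation over an IdP export. The account structure below is hypothetical:

```python
def mfa_coverage(accounts):
    """Percentage of accounts with MFA enforced, one of the executive
    metrics listed above."""
    if not accounts:
        return 0.0
    return 100.0 * sum(1 for a in accounts if a.get("mfa")) / len(accounts)

# Hypothetical IdP export
accounts = [{"user": "a", "mfa": True}, {"user": "b", "mfa": True},
            {"user": "c", "mfa": False}, {"user": "d", "mfa": True}]
print(f"{mfa_coverage(accounts):.0f}% MFA coverage")  # 75% MFA coverage
```

The same shape works for the other metrics: each is a ratio or a duration computed from an export you should already be archiving as evidence.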

When to engage external auditor — timing considerations

The decision of when to engage the SOC 2 auditor affects engagement quality.

Pre-readiness engagement. Engaging the auditor before completing readiness work allows the auditor to provide guidance during gap closure. Cost: somewhat higher (guidance time billed). Benefit: smoother fieldwork execution.

Post-readiness engagement. Engaging after completing readiness allows clear scope-of-work definition. Cost: somewhat lower. Risk: gaps discovered during fieldwork.

The most operationally rational pattern. Engage the auditor for a brief gap-assessment review at the end of the readiness phase, then formally engage for the observation period and fieldwork. This combines the guidance benefit with clear scope.

Common surprises during the SOC 2 observation period

For organisations entering the observation period for the first time, certain surprises emerge that the readiness phase may not anticipate.

Engineering team load. Evidence collection during observation typically consumes 8–15 hours per week of engineering time. The cumulative load over a 6-month observation period is substantial. Plan capacity accordingly.

Audit-evidence freshness. Evidence gets stale; what was current in month 1 may need refresh by month 6. Continuous evidence collection is materially easier than periodic catch-up.

Customer-support load for evidence requests. Auditors ask for specific evidence during fieldwork, and those requests can disrupt customer-facing teams. Coordinate auditor questions through a single point of contact.

Vendor-evidence gaps. Critical vendors should produce SOC 2 reports or alternative evidence; if they don’t, the gap requires alternative documentation. Identify gaps early.

Documentation drift. Policies updated without version control or sign-off cause audit issues. Maintain version control and sign-off discipline throughout the observation period.

Personnel changes. Departing employees with access to in-scope systems require documented access termination. Joiner-mover-leaver workflow is one of the most-tested SOC 2 controls.

Configuration drift. Production configurations change continuously; without IaC discipline, the configuration at audit time may differ from what was documented. Strong IaC practice prevents this.

Customer-data classification. Classification disagreements between teams can produce inconsistent control application. Clear data-classification policy with examples reduces ambiguity.

SOC 2 readiness investment ROI

For Bangalore SaaS founders evaluating SOC 2 readiness investment ROI, the calculation has multiple dimensions.

Pipeline conversion impact. Enterprise prospects gated by SOC 2 demand convert at 30–50% higher rates with SOC 2 in hand. For SaaS companies with material enterprise pipeline, conversion impact alone justifies investment.

Deal-cycle compression. The average enterprise deal cycle compresses by 4–8 weeks with SOC 2 in hand. Time-to-revenue improvement is real economic value.

Procurement-friction reduction. Vendor security questionnaires consume engineering time; SOC 2 reports answer most questionnaire items, reducing per-deal friction.

Renewals and expansion. Existing customer renewals reference compliance posture; SOC 2 maintenance signals operational maturity affecting renewal terms and expansion conversations.

Competitive positioning. Markets where competitors don’t yet have SOC 2 reward the early mover; markets where SOC 2 is universal penalise those without it.

For most Bangalore Series-A+ SaaS companies with US enterprise pipeline, SOC 2 investment ROI exceeds 3× within 24 months when measured comprehensively.
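The 3× figure can be sanity-checked directionally. A sketch with entirely hypothetical pipeline numbers — replace every input with your own data before taking the output to a board:

```python
def soc2_roi(annual_gated_pipeline_inr: float, base_win_rate: float,
             conversion_uplift: float, total_cost_inr: float,
             years: float = 2.0) -> float:
    """Directional ROI multiple: incremental revenue from a higher win rate
    on SOC 2-gated pipeline, divided by the total programme cost.
    All inputs are assumptions, not benchmarks."""
    incremental = (annual_gated_pipeline_inr * base_win_rate
                   * conversion_uplift * years)
    return incremental / total_cost_inr

# Hypothetical: 5 crore gated pipeline, 20% base win rate, 40% uplift,
# 20 lakh all-in programme cost, 24-month horizon -> roughly 4x
print(soc2_roi(50_000_000, 0.20, 0.40, 2_000_000))
```

The sketch deliberately ignores deal-cycle compression and renewal effects, so it understates the comprehensive figure the paragraph above describes.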

Final notes on readiness preparation

The readiness phase is where the SOC 2 outcome is largely determined. Mature organisations invest deliberately in this phase — comprehensive gap closure, evidence-collection automation, control-owner training, dry-run management interviews, and proactive auditor coordination. Cutting corners during readiness produces problems during fieldwork. Investment in readiness pays back through smoother fieldwork execution, fewer findings, and ultimately a clean opinion that buyers accept without follow-up questions.

API4SOC2 Editorial
Compliance Practice Lead, Bengaluru
Bengaluru-based partner at API4SOC2. CERT-In empanelled lead auditor with 12+ years of compliance practice across Indian BFSI, fintech, and SaaS engagements. Has signed off on 80+ SOC 2 and ISO 27001 attestations.
Ready to scope this engagement?

Book a thirty-minute scoping call.

Tell us your framework, your stack and the deadline. You leave the call with a written scope, a fixed price in INR, and a kick-off invite.