
The APRA CPS 234 audit: what auditors actually look for

Most CPS 234 audit findings are not surprises to the audited entity — they are gaps the entity already knew about and chose to defer. The audit just makes them visible. Here's the structure of the audit and the findings that recur.

Mathew Sayed · 7 min read

CPS 234 audits — both the formal independent audit required under paragraph 38 and the broader CPS 234-aligned reviews requested by audit committees — produce a recognisable set of findings across mid-market financial services. Not because the entities are unusually weak, but because CPS 234 imposes substantive operational expectations that most organisations have implemented partially.

This post is what we tell clients before a CPS 234 audit. The structure of the audit, the findings that recur, and what an “audit-ready” posture looks like in practice.

What CPS 234 actually requires

The standard is short and dense. The substantive requirements:

  • Information security capability commensurate with the size and risk profile of the entity (paragraph 14).
  • Roles and responsibilities for information security clearly defined (paragraphs 16-21).
  • Information security policy framework that addresses the relevant matters (paragraph 22).
  • Information assets identified and classified by criticality and sensitivity (paragraph 23).
  • Implementation of information security controls to protect information assets (paragraphs 25-29), including for information assets managed by third parties (paragraph 26).
  • Incident management including detection, response, recovery, post-incident review (paragraph 27).
  • Testing of information security controls through a systematic, regular, risk-based program (paragraph 30).
  • Internal audit review of information security controls (paragraphs 32-37).
  • APRA notification within 72 hours of becoming aware of material information security incidents (paragraph 35).
  • External audit attestation of compliance with the standard (paragraph 38).

Each of these has implementation depth. The audit assesses whether the entity meets each one in practice.

The structure of the audit

A CPS 234 audit — whether the formal independent audit, an internal audit review, or a pre-audit gap assessment — follows a recognisable pattern.

Phase 1: Document review

The auditor requests:

  • Information security policy and supporting policies.
  • Information asset register / classification scheme.
  • Risk register, particularly for information security risks.
  • Incident response plan.
  • Testing program documentation.
  • Training records.
  • Service provider register and arrangements.
  • Recent incident records.
  • Recent testing records.
  • Recent management reporting on information security.
  • Risk function and audit function reports on information security.

The document review tells the auditor what the entity says it does. The next phase tests whether it actually does.

Phase 2: Walkthroughs and interviews

For each substantive control area, the auditor interviews the responsible person and walks through the control:

  • Show me how a new employee gets access. (Identity provisioning.)
  • Show me how access is reviewed. (Periodic access review.)
  • Show me how a security incident is escalated. (Incident response process.)
  • Show me how a vendor is approved. (Service provider management.)
  • Show me how a control is tested. (Testing program.)

The walkthrough surfaces gaps between policy and practice. Most CPS 234 findings come from this phase.

Phase 3: Sample testing

For each control, the auditor selects a sample and tests:

  • Show me the access review for these 10 systems. Were the reviews performed, signed, and acted on?
  • Show me the incident response records for these 5 incidents. Were the response steps followed, the escalations made, the lessons captured?
  • Show me the testing for these 3 controls. Was the testing performed, the findings documented, the remediation tracked?

Sample testing finds the controls where policy and practice diverge in specific cases.
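The three attributes tested in each sample — performed, signed off, actioned — are mechanical enough to pre-check internally before the auditor arrives. A minimal sketch, assuming a hypothetical record schema (the field and function names are illustrative, not from any audit toolkit):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AccessReview:
    # One sampled access-review record (hypothetical schema, for illustration).
    system: str
    performed: bool               # was the review actually run in the period?
    signed_off_by: Optional[str]  # who approved it, if anyone
    exceptions_actioned: bool     # were flagged accounts actually changed?

def check_sample(sample: List[AccessReview]) -> List[str]:
    # Emit one finding per record that fails any of the three tested attributes.
    findings = []
    for r in sample:
        if not r.performed:
            findings.append(f"{r.system}: review not performed")
        elif r.signed_off_by is None:
            findings.append(f"{r.system}: performed but not signed off")
        elif not r.exceptions_actioned:
            findings.append(f"{r.system}: exceptions raised but not actioned")
    return findings
```

Running this over the same 10 systems the auditor would pick surfaces the exceptions on your timeline rather than theirs.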

Phase 4: Reporting

The findings, by severity:

  • Material — fundamental gap between the standard’s requirement and the entity’s practice. Often triggers regulatory attention.
  • Significant — substantial gap that requires remediation in the audit cycle.
  • Moderate — notable gap but not material. Tracked through to remediation.
  • Minor — tightening opportunities.

For the formal independent audit, the findings are reported to the entity’s board and to APRA (paragraph 38).

Findings that recur

Across the audits we’ve assisted with, the findings cluster.

1. The information asset register is incomplete or stale

Paragraph 23 requires identification and classification of information assets. The audit looks for:

  • A register that exists.
  • That covers the material data, systems, and services.
  • That has classifications applied (public/internal/confidential/restricted or equivalent).
  • That is reviewed and updated.

The common gaps: the register exists but excludes shadow IT and SaaS; the classifications are theoretical (no operational consequence); the review cadence is not actually followed.

The fix: a register that covers the material assets, with classifications that drive control decisions, reviewed at the cadence specified in policy.
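"Classifications that drive control decisions" can be made concrete: treat the classification level as a lookup key into a minimum control baseline, so a register entry's classification mechanically determines what must be done for that asset. A sketch using the public/internal/confidential/restricted levels above — the baseline values are invented for illustration, not drawn from CPS 234:

```python
# Illustrative control baseline keyed by classification level.
# The specific controls and cadences here are assumptions, not requirements.
CONTROL_BASELINE = {
    "public":       {"encryption_at_rest": False, "access_review_days": 365},
    "internal":     {"encryption_at_rest": False, "access_review_days": 180},
    "confidential": {"encryption_at_rest": True,  "access_review_days": 90},
    "restricted":   {"encryption_at_rest": True,  "access_review_days": 30},
}

def required_controls(classification: str) -> dict:
    # Look up the minimum control set implied by a classification.
    # A KeyError here means an asset carries a label the policy doesn't define.
    return CONTROL_BASELINE[classification]
```

The point is the coupling: if reclassifying an asset changes nothing operationally, the auditor will conclude the classifications are theoretical.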

2. Service provider arrangements lack substantive content

Paragraph 26 requires that information security controls extend to assets managed by third parties. The audit looks for:

  • Service provider register identifying which providers handle information assets.
  • Contractual arrangements with information security obligations.
  • Ongoing monitoring of service provider information security.
  • Arrangements for managing service provider incidents.

The common gaps: the register exists but is not differentiated by tier; contracts have generic security clauses without enforcement teeth; ongoing monitoring is annual questionnaires only; incident arrangements are untested.

The fix: tiered service provider management (covered separately) with substantive contracts and continuous monitoring.

3. Testing program is not evidence-driven

Paragraph 30 requires a systematic testing program. The audit looks for:

  • A documented testing program.
  • Evidence of testing performed (penetration tests, vulnerability scans, control effectiveness testing).
  • Findings documented and tracked to remediation.
  • Risk-based selection of what’s tested.

The common gaps: the testing program exists on paper but the evidence is sparse; testing is annual penetration test only; findings exist but remediation tracking has stalled; risk-based selection is not explicit.

The fix: a testing program that visibly produces evidence — annual penetration tests on critical systems, regular vulnerability scanning with remediation tracking, periodic control effectiveness testing, risk-based selection documented.
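The "remediation tracking has stalled" gap is easy to surface before the auditor does: any finding still open past its due date is an exception. A minimal sketch over a hypothetical findings log (the record shape is illustrative):

```python
from datetime import date

# Hypothetical remediation log entries; the schema is an assumption.
findings = [
    {"id": "PT-2024-01", "due": date(2024, 6, 30), "status": "open"},
    {"id": "VS-2024-07", "due": date(2024, 9, 30), "status": "closed"},
]

def stalled(findings, today):
    # Findings still open past their remediation due date.
    return [f["id"] for f in findings if f["status"] == "open" and f["due"] < today]
```

Run weekly, this turns "remediation tracking has stalled" from an audit finding into a standing management report line.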

4. Incident management lacks operational evidence

Paragraph 27 requires incident management capability. The audit looks for:

  • An incident response plan.
  • Evidence of plan exercises.
  • Records of actual incidents handled.
  • Post-incident reviews with lessons captured.

The common gaps: the plan exists but hasn’t been exercised in 18+ months; actual incident records are thin (suggesting incidents either aren’t happening or aren’t being recorded); post-incident reviews are template completions without real learning.

The fix: regular tabletop exercises (covering both technical and decision-making, as discussed elsewhere), comprehensive incident logging including minor incidents, post-incident reviews that produce specific changes.

5. APRA notification thresholds are not operationalised

Paragraph 35 requires notification within 72 hours of awareness of material incidents. The audit looks for:

  • A documented threshold for APRA notification.
  • Records of notifications made.
  • Evidence that the threshold is operationalised — that staff know it and can apply it.

The common gaps: the threshold is documented in policy but not in operational playbooks; notifications haven’t been made (suggesting either no qualifying events, or events not properly classified); the threshold is interpreted by individuals without consistent application.

The fix: the threshold appears in the incident playbook with examples; the incident commander has explicit authority and obligation to declare a notifiable event; staff are trained on it.
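The 72-hour clock is simple arithmetic, but playbooks routinely get the start point wrong: paragraph 35 runs the window from awareness of the incident, not from its occurrence. A sketch of the deadline calculation (function names are illustrative):

```python
from datetime import datetime, timedelta, timezone

# Paragraph 35 window, measured from when the entity *became aware*.
NOTIFICATION_WINDOW = timedelta(hours=72)

def apra_deadline(aware_at: datetime) -> datetime:
    # Deadline runs from awareness, not from when the incident occurred.
    return aware_at + NOTIFICATION_WINDOW

def hours_remaining(aware_at: datetime, now: datetime) -> float:
    # Time left before the window closes (negative means overdue).
    return (apra_deadline(aware_at) - now).total_seconds() / 3600
```

Embedding this in the incident tooling — a visible countdown once an event is classified as potentially notifiable — is what "operationalised" looks like to an auditor.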

6. The risk function and audit function are not visibly involved

Paragraphs 32-37 require risk function and internal audit involvement in information security. The audit looks for:

  • Evidence of risk function review of the information security framework.
  • Evidence of internal audit review of information security controls.
  • Findings flowing into management reporting and remediation.

The common gaps: information security is treated as a technical-only domain, with limited risk function or audit involvement; reports exist but findings don’t clearly flow to remediation tracking.

The fix: risk function reviews the information security framework annually with a substantive report; internal audit covers information security controls in its plan with recurring coverage; findings are tracked through the standard governance reporting.

The audit-ready posture

An organisation that’s audit-ready against CPS 234 has, broadly:

  • Information asset register that’s current and used.
  • Information security policy framework that’s specific enough to drive practice.
  • Service provider arrangements with substantive contracts and ongoing monitoring.
  • Testing program with visible evidence.
  • Incident management with exercise history.
  • Risk and audit involvement with clear reporting.
  • A small number of named individuals who can answer specific questions about specific controls.

The audit-ready posture is not perfect. It’s defensible — the gaps that exist are documented and being remediated, with reasonable timeframes and ownership.

A practical first move

If you have a CPS 234 audit in the next 6–12 months: run a pre-audit gap assessment. The goal is to surface the findings before the auditor does, with time to remediate or at least articulate a remediation plan. Most audit findings are not surprises to the audited entity — they’re known gaps. Documenting them with a remediation plan converts a finding into a known issue being addressed, which is a different conversation with the auditor and the regulator.

The Security Posture Assessment is structured to produce this output. The deliverable maps to CPS 234 requirements directly, with findings sized for board and audit committee review and a prioritised remediation roadmap. Engagements specifically scoped as pre-audit are run to the same framework with adjusted timing.
