Plain-English Security Reports: What a Real Finding Looks Like
A security audit report should read like a human wrote it. Here's what a P23 deliverable contains, how risk ratings work, and why jargon is a red flag.
If you need a translator, it’s the wrong report.
There is a category of security report that reads like it was written by lawyers, for lawyers, about a building lawyers have never entered. Dense acronyms. Passive voice. Risk matrices that look serious and say almost nothing. Twelve pages of appendix template pulled from another engagement.
That is not what an audit report should be. An audit report should read like a competent human wrote it to another competent human who happens to run the organization being audited.
Plain-English writing is not about being simple. It is about being clear. Clear writing makes clear thinking visible. That is what a report is for.
The structure we use.
Every P23 audit report follows the same structure. The consistency is deliberate. An organization that receives multiple reports over time should be able to compare them easily, and a reader new to the document should find the structure familiar.
1. Executive summary
One to three pages. The most important findings, the overall risk posture, the priorities for action. Written so that a reader who never opens any other section walks away with the core message.
2. Scope and methodology
What was reviewed. Who was interviewed. What was not reviewed and why. This section protects both the reader and the auditor by making the engagement’s boundaries explicit.
3. Findings
The heart of the document. Every finding follows the same structure:
- Finding statement: a single sentence describing what was observed
- Observation detail: what we saw, where, when, and how
- Risk rating: high, moderate, or low, with the reasoning for that rating
- Recommendation: a concrete, actionable step to address the finding
- Priority: 30-day, 60-day, or 90-day
Findings are grouped by domain (physical, policy, personnel, technology, training) and ranked within each domain by risk.
4. Recommendations and action plan
The 30/60/90-day action plan consolidates the recommendations in a single implementation view. It tells leadership what to do first, what to do next, and what to plan for later.
5. Appendices
Maps, photographs, specific policy language where relevant, interview summaries where confidentiality allows. The appendices support the main narrative without cluttering it.
What a finding actually looks like.
The best way to demonstrate the standard is to show one. Here is a representative finding from a generic audit, written the way P23 writes findings:
Notice what the finding does. It describes what was seen, not a theoretical concern. It rates the risk and explains the rating. It recommends specific action, not a vague principle. It assigns a priority. A reader can do something with it.
A finding that reads “The facility has inadequate access control” is not a finding. It is a complaint. A finding should be specific enough that a leadership team could delegate it to a specific person with a specific deadline.
The risk rating, and why we only use three.
Some audit firms use five-tier or seven-tier risk matrices with percentages and impact scores. We do not.
Three tiers are what humans actually use to make decisions.
- High: Fix this. Now or soon. Left uncorrected, the finding could plausibly contribute to a serious incident. The recommendation should go into the 30-day bucket.
- Moderate: Address this. Material concern, not immediately dangerous. 60- to 90-day window, depending on complexity and cost.
- Low: Note this. Worth knowing about, worth fixing when convenient, does not require scheduled priority.
Every rating includes a sentence or two of reasoning. The reasoning is as important as the rating itself. It lets the reader argue with the rating if they disagree, and it documents how we reached the judgment.
Jargon, and why it matters.
Security jargon has a purpose. Inside the discipline, terms like “layered defense,” “single point of failure,” “defense in depth,” and “human factors” communicate quickly. The problem starts when jargon leaves the discipline and enters a report written for people outside it.
A few specific rules we follow:
- Every technical term gets a plain-English paraphrase on first use
- Acronyms are expanded on first use, every time, without apology
- Passive voice is avoided except where it genuinely clarifies
- We do not use “security theater,” “threat surface,” or “attack vector” without translation
- We do not cite NIST, CISA, or ISO frameworks without explaining what they are and why they matter here
The rule behind the rules: the reader is intelligent but not a specialist. Write to honor both facts.
What the Hurricane Ian reviews taught us.
After Hurricane Ian in 2022, we worked with several Southwest Florida organizations on post-event reviews. The reports that proved most useful in the recovery period were not the longest or most technical. They were the ones that had been written clearly enough for the executive director to hand to the board, the board to hand to the pastor, the pastor to hand to the safety team lead.
A well-written report travels inside the organization. A poorly written report sits in a drawer. When the next event arrives, the difference is measurable.
A good report respects the intelligence of the reader by giving them information they can actually use. That is the standard. Anything less is the auditor’s ego getting in the way of the work.
How leadership should read the report.
A report is a tool. Using it well takes a small discipline.
- Read the executive summary first. Resist the urge to flip to specific findings before the overview is clear.
- Read findings by priority, not by page order. Start with the 30-day items.
- Take notes in the margin. The report is yours. Mark the findings you have questions about.
- Schedule a walkthrough with the auditor to discuss findings, if that is not already included. The conversation produces more than the document.
- Assign each finding to a specific person with a specific deadline. An unassigned finding is a finding that will not close.
- Schedule a follow-up review in 90 days. Track closure. The audit is an event; the program is ongoing.
What to ask for before you hire an auditor.
If you are evaluating a security auditor, ask to see a redacted sample report. The sample will tell you more than any sales call. Look for:
- Specificity in findings (not general principles)
- Clarity in language (no gratuitous jargon)
- Structured risk ratings with reasoning
- A real 30/60/90-day action plan
- Appropriate length (complete but not padded)
- A consistent voice throughout (not obviously stitched from a template)
A firm that cannot show you a representative sample, or whose sample reads like a template filled with your organization’s name, will produce the same for you.
Reports are the visible part of the work.
The walkthrough, the interviews, the policy review, and the technology inspection all happen before the report. The report is what the organization is left with. It is the tool that carries the audit forward, into action, into the next conversation with leadership, into the decisions that will or will not get made.
Writing it well is part of the work. At P23 Security, we treat it that way. Every report is written, reviewed, and sharpened until it reads like what it is: a clear-headed, honest assessment of what we found and what to do about it.
If that is the report you want for your organization in Fort Myers, Cape Coral, Naples, or Port Charlotte, we would be glad to start the conversation.
Ready when you are
An honest audit, written the way a human writes.
Flat-rate. Plain-English report. 30/60/90-day action plan. We audit. You decide.
Request a flat-rate audit