
An institutional website audit is not the SEO-and-marketing audit that consumer websites typically receive. It is a structured assessment across five dimensions that matter to public-sector institutions: performance, accessibility, security, content integrity, and compliance posture. For government agencies, universities, healthcare institutions, and nonprofits, the audit produces a documented baseline that drives operational decisions and supplies evidence for stakeholders who care about institutional risk. This post is the operational framework.
We covered the broader operational pattern across CMS platforms in our performance and security posts. This post is about the audit itself: what gets evaluated, how it gets evaluated, and what the institution does with the result.
The Five Audit Dimensions That Matter
The institutional website audit covers five dimensions. A single-dimension audit (just performance, just accessibility, just security) misses interactions between dimensions that matter operationally.
Performance
What gets evaluated: Core Web Vitals across representative pages, time-to-first-byte from realistic user locations, asset weight, render-blocking resources, third-party script impact, server response times under load. Tools include Lighthouse, WebPageTest, AWS CloudWatch RUM, and synthetic monitoring against a representative URL set.
What "passing" looks like: LCP under 2.5 seconds, INP under 200 milliseconds, CLS under 0.1, TTFB under 800 milliseconds. Performance budget documented and tracked.
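The thresholds above can be expressed as a small performance-budget gate. This is a sketch, assuming metric values exported from Lighthouse or RUM tooling; the metric keys and the sample page values are illustrative, not a standard schema.

```python
# Performance-budget gate using the thresholds quoted above.
# Keys and units are illustrative; real values would come from
# Lighthouse JSON output or a RUM export.
BUDGET = {
    "lcp_ms": 2500,   # Largest Contentful Paint
    "inp_ms": 200,    # Interaction to Next Paint
    "cls": 0.1,       # Cumulative Layout Shift (unitless)
    "ttfb_ms": 800,   # Time to First Byte
}

def budget_failures(measured: dict) -> list[str]:
    """Return the metrics that exceed the documented budget."""
    return [m for m, limit in BUDGET.items() if measured.get(m, 0) > limit]

# Example: a page with a slow LCP but everything else in budget.
page = {"lcp_ms": 3100, "inp_ms": 150, "cls": 0.05, "ttfb_ms": 620}
print(budget_failures(page))  # ['lcp_ms']
```

Wiring a check like this into CI is what "performance budget documented and tracked" looks like in practice: a regression fails the build instead of surfacing at the next audit.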
Accessibility
What gets evaluated: WCAG 2.1 AA conformance across representative templates, keyboard navigation flows, screen-reader compatibility, color contrast, alt-text coverage, form-label associations, video caption availability, and PDF accessibility. Tools include axe-core, WAVE, manual screen-reader testing (NVDA, JAWS, VoiceOver), and keyboard-only navigation walkthroughs.
What "passing" looks like for institutional sites: WCAG 2.1 AA conformance documented, Title II conformance (for government) and Section 508 conformance (for federal) demonstrated, and an institutional accessibility statement published. The audit produces a remediation backlog with priority and timeline.
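One of the automatable checks above, alt-text coverage, can be sketched with the standard library alone. This is illustrative only: it complements, not replaces, the manual screen-reader and keyboard testing the audit requires, and real tooling (axe-core, WAVE) covers far more criteria.

```python
# Minimal alt-text coverage check over raw HTML using only the
# standard library. Sketch only; axe-core or WAVE do the real work.
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.total = 0
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.total += 1
            # A missing alt attribute is a failure; alt="" is valid
            # for purely decorative images and is not flagged here.
            if "alt" not in dict(attrs):
                self.missing += 1

page = '<img src="seal.png" alt="University seal"><img src="spacer.gif">'
audit = AltTextAudit()
audit.feed(page)
print(f"{audit.missing} of {audit.total} images missing alt text")
```

Findings from a check like this feed the remediation backlog directly: each missing-alt instance is a finding with an affected page and a concrete fix.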
Security
What gets evaluated: TLS configuration, security headers (CSP, HSTS, X-Frame-Options, Referrer-Policy), authentication surface posture, plugin and theme vulnerability inventory, exposed file inventory (admin paths, configuration files, backups), and external scan results. Tools include SSL Labs, Mozilla Observatory, OWASP ZAP, and institutional vulnerability scanners.
What "passing" looks like: an A grade or higher on SSL Labs, B+ or higher on Mozilla Observatory, no high-severity findings in the OWASP ZAP scan, a current plugin and theme inventory with no known unpatched CVEs, and an institutional WAF deployed.
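The security-header portion of the check can be sketched as a comparison against a required-header list. This is a hedged sketch: in practice the headers come from a live HTTP response and their *values* matter as much as their presence (a weak CSP still fails), which is what Mozilla Observatory actually grades.

```python
# Spot-check for the security headers named above. Expressed over a
# plain headers dict so the sketch runs without network access;
# a real check would also validate header values, not just presence.
REQUIRED_HEADERS = [
    "strict-transport-security",   # HSTS
    "content-security-policy",     # CSP
    "x-frame-options",
    "referrer-policy",
]

def missing_headers(headers: dict) -> list[str]:
    """Return required headers absent from a response (case-insensitive)."""
    present = {k.lower() for k in headers}
    return [h for h in REQUIRED_HEADERS if h not in present]

response_headers = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Frame-Options": "DENY",
}
print(missing_headers(response_headers))
# ['content-security-policy', 'referrer-policy']
```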
Content Integrity
What gets evaluated: broken internal and external links, image rendering across templates, form-submission paths, search functionality, content currency (institutional pages with stale dates), and orphaned pages (pages not linked from navigation but still indexed). Tools include Screaming Frog, Sitebulb, and a manual content-team walkthrough.
What "passing" looks like: zero broken links on critical paths, content currency documented per content type, institutional content review process visible.
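The broken-internal-link portion of the check can be sketched as link extraction plus comparison against a known-URL set. This assumes a crawl has already produced that set; a real crawler (Screaming Frog, Sitebulb) also follows redirects, checks external links, and finds the orphaned pages mentioned above.

```python
# Sketch of a broken-internal-link check: collect internal hrefs
# from a page and compare against the set of known URLs produced
# by a prior crawl. Illustrative only.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href.startswith("/"):   # internal links only
                self.links.append(href)

known_pages = {"/admissions", "/contact"}
collector = LinkCollector()
collector.feed('<a href="/admissions">Apply</a> <a href="/oldpage">Old</a>')
broken = [link for link in collector.links if link not in known_pages]
print(broken)  # ['/oldpage']
```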
Compliance Posture
What gets evaluated: privacy policy currency and accuracy, cookie disclosure compliance (where applicable), institutional data handling alignment with stated policies, accessibility statement, terms of use, Section 508 (for federal), state-level public-sector requirements (where applicable). For institutional sites with regulated data adjacencies (health information, education records, financial information), the relevant framework alignment is part of the audit.
What "passing" looks like: institutional governance documents current, posted, and aligned with actual practice.
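"Governance documents current" is checkable if each document records its last review date against an institutional review interval. The sketch below assumes exactly that; the document names and the 365-day interval are illustrative, not a policy recommendation.

```python
# Governance-document currency check. Document names and the
# 365-day review interval are illustrative assumptions.
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=365)

governance_docs = {
    "privacy-policy": date(2024, 1, 15),
    "accessibility-statement": date(2022, 6, 1),
    "terms-of-use": date(2024, 3, 10),
}

def stale_documents(docs: dict, today: date) -> list[str]:
    """Return documents whose last review exceeds the interval."""
    return [name for name, reviewed in docs.items()
            if today - reviewed > REVIEW_INTERVAL]

print(stale_documents(governance_docs, date(2024, 9, 1)))
# ['accessibility-statement']
```

Currency is necessary but not sufficient: the audit also checks that the posted documents align with actual practice, which no date check can automate.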
What the Audit Produces
The audit produces three artifacts that institutions use:
Findings register. A documented list of findings with severity, dimension, affected pages or templates, evidence (screenshots, scan output), and remediation guidance. The findings register is the working document for the remediation effort.
Baseline report. An executive-readable summary of the institutional site's posture across the five dimensions, with comparison to institutional targets and a recommendation for the remediation roadmap.
Remediation roadmap. A prioritized plan for addressing findings, with effort estimates, dependencies, and target completion dates. The roadmap is institutional, not technical-only: it includes content-team work, design-team work, IT-team work, and external-vendor work as appropriate.
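A findings-register entry carries the fields described above, and the roadmap works the register in severity order. The sketch below shows one possible shape; the field names and severity scale are illustrative, not a standard schema.

```python
# Sketch of a findings-register entry and severity ordering.
# Field names and the severity scale are illustrative assumptions.
from dataclasses import dataclass

SEVERITY_ORDER = {"critical": 0, "high": 1, "medium": 2, "low": 3}

@dataclass
class Finding:
    severity: str
    dimension: str      # performance, accessibility, security, etc.
    affected: str       # page or template
    evidence: str       # screenshot path or scan-output reference
    remediation: str

findings = [
    Finding("medium", "performance", "/news template",
            "lighthouse.json", "Defer third-party scripts"),
    Finding("critical", "security", "/wp-admin",
            "zap-report.html", "Restrict admin path"),
]

# The remediation roadmap works the register in severity order.
roadmap = sorted(findings, key=lambda f: SEVERITY_ORDER[f.severity])
print([f.dimension for f in roadmap])  # ['security', 'performance']
```

Keeping the register as structured data rather than a slide deck is what makes it a working document: it can be filtered, re-sorted, and tracked against the roadmap between audits.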
The audit is a moment-in-time snapshot. Mature institutional websites are audited on cadence (annually for full audit, quarterly for performance and security spot-checks) so the baseline stays current.
Who Conducts the Audit
The institutional pattern that holds:
Internal audit by IT or web team. For institutions with mature internal operations, the audit is conducted internally with appropriate tooling. The output is institutional-owned and folded into ongoing operational discipline.
Third-party audit by managed-services provider. For institutions using a managed WebOps partner, the audit is part of the engagement. The third-party perspective surfaces issues that internal staff become blind to over time.
Specialized audit by accessibility, security, or compliance specialists. For institutions with specific compliance requirements (Section 508 for federal, HIPAA for healthcare adjacencies, state-level requirements), the relevant dimension is sometimes audited by a specialist firm. The specialist audit complements rather than replaces the broader institutional audit.
Combination. Most institutional patterns combine internal continuous monitoring with periodic external audit. The continuous monitoring catches drift; the periodic external audit catches blind spots.
What an Audit Is Not
The institutional website audit is not:
An SEO audit. Search-engine visibility is sometimes part of the audit but is not the central concern for most institutional sites. Institutions whose primary mission is reach (some nonprofit campaign sites, some institutional public-information programs) have SEO as a real concern. Most institutional sites do not.
A redesign justification. The audit produces a remediation roadmap, not a new visual identity. Institutional redesigns happen on their own cadence and are informed by audit findings but not driven by them.
A one-time event. A single audit is a snapshot. The value comes from cadenced audit, ongoing monitoring, and operational discipline that maintains the posture between audits.
A consultant deliverable that sits on a shelf. The findings register and remediation roadmap are working documents. If they are not being worked, the audit was a waste.
What Mature Institutional Websites Look Like After Audit
Institutional sites with mature audit practice share visible characteristics: documented performance budget with current measurement, WCAG 2.1 AA conformance maintained, security posture aligned to institutional baseline, content review cadence visible, governance documents current, and remediation work tracked against the roadmap.
For managed WebOps engagements supporting institutional sites, the audit is the entry point and the recurring touchpoint. The audit identifies what needs to happen; the WebOps engagement makes it happen.
Frequently Asked Questions
How long does an institutional website audit take?
For a single institutional site of moderate complexity: two to four weeks for the full five-dimension audit, including technical scanning, manual review, and report production. For an institution with multiple sites, the audit can be batched with shared scanning infrastructure.
What is the typical cost of an institutional website audit?
It varies. For a managed-services engagement that includes audit, the audit is often included as the engagement entry point. For standalone third-party audits, the range is wide: $5,000 to $50,000 depending on site complexity, dimension coverage, and specialist accessibility or security review depth.
How often should institutional sites be audited?
The pattern that holds: full audit annually, performance and security spot-checks quarterly, content integrity continuous through institutional content workflow. After major site changes (redesign, CMS migration, hosting migration), the audit is repeated.
What is the difference between an audit and a penetration test?
An audit is a structured posture assessment across multiple dimensions. A penetration test is a security-specific exercise where a security team attempts to compromise the site. For institutional sites with regulated data or high public-sector visibility, both are part of the security posture: audit catches the posture issues, pen test catches the exploit chains.