Knowledge Foundations

What Underpins a Rigorous Accessibility Review

Accessibility review isn't a single skill or a checklist. It draws on four interconnected bodies of knowledge — and the most effective reviews hold all four in view at once.

Disability & Learner Variability

Understanding disability models, assistive technologies, and how barriers — physical, cognitive, sensory, situational — affect participation. Disability is often contextual, not fixed, and this shapes how barriers are identified and addressed.

Universal Design & UDL

Applying Universal Design for Learning — multiple means of Engagement, Representation, and Action & Expression — so inclusion is built into content rather than added after the fact. This requires understanding both the framework and its application to learning design.

Technical Accessibility Standards

Fluency in WCAG 2.1/2.2 and related ICT criteria — the technical baseline for web content, LMS platforms, documents, media, and digital learning objects. Standards fluency means being able to interpret and apply criteria in context, not just run automated checks.

Laws, Governance & Institutional Context

Understanding how accessibility legislation, procurement policy, and institutional governance interact. Technical findings only create lasting change when they connect to real accountability structures within an organization.

A practical hierarchy for the knowledge areas:
Use WCAG to audit the artifact. Use UDL to evaluate the learning experience. Use inclusive design to assess the system. Strong accessibility review holds all three lenses simultaneously — and situates findings within the institutional context in which they need to be acted on.

What distinguishes expertise from checklist compliance

Automated tools detect a subset of accessibility issues — heading structure, colour contrast, missing alt text. They cannot assess whether a learning experience is equitable, whether an assessment is accessible in its design, or whether the barriers that most affect a specific learner population have been addressed. The gap between "technically passes" and "actually works for everyone" is where human expertise is irreplaceable.
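The kinds of checks automated tools perform can be illustrated with a short sketch. The class and rule set below are illustrative, built only on Python's standard-library html.parser, not a real tool's implementation. It flags two of the machine-detectable issues mentioned above: images with no alt attribute and skipped heading levels.

```python
from html.parser import HTMLParser

class BasicA11yChecker(HTMLParser):
    """Illustrative checker: flags images without an alt attribute and
    skipped heading levels. Note what it cannot do: it accepts alt text
    that is present but meaningless, and says nothing about equity."""

    def __init__(self):
        super().__init__()
        self.issues = []
        self._last_heading = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.issues.append("img missing alt attribute")
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            level = int(tag[1])
            # A jump of more than one level (e.g. h1 -> h3) breaks the outline.
            if self._last_heading and level > self._last_heading + 1:
                self.issues.append(
                    f"heading jumps from h{self._last_heading} to h{level}")
            self._last_heading = level

checker = BasicA11yChecker()
checker.feed("<h1>Module</h1><h3>Topic</h3><img src='diagram.png'>")
print(checker.issues)
```

Both flagged issues are real, but neither check can say whether the alt text, once added, actually describes the diagram — that judgment remains human work.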

On the difference between accommodation and design:
Accommodation addresses an individual's need after the fact. Accessible design reduces the need for accommodation by anticipating learner variability from the outset. Neither eliminates the other, but organizations that rely primarily on accommodation have not yet addressed the underlying design problem.

Methodology

How a Rigorous Accessibility Review Works

Effective accessibility review is structured, iterative, and multidimensional. The stages below represent a comprehensive approach — in practice, scope and depth vary based on context, resources, and what's most needed.

Stage 1: Scoping & Context (Foundational · Collaborative)

Before any review begins, it's essential to understand the learning environment, the learner population, and the institutional context. What content types are in scope? Which systems are involved — LMS, authoring tools, documents, media, assessments? Who are the learners, and what is already known about barriers they face? Scoping shapes the relevance and usefulness of everything that follows.

What this stage produces

  • Shared understanding of scope and boundaries
  • Learner population context including known barriers
  • Prioritized list of content and systems for review

Skipping or rushing this stage is one of the most common reasons accessibility reviews produce findings that can't be acted on — because they aren't connected to the actual context of the organization.

Stage 2: Barrier Analysis (Systematic · High-stakes)

Barrier analysis looks across four dimensions — physical/motor, cognitive, sensory, and situational — to identify where learners encounter obstacles. A key distinction is between minor friction and task-blocking barriers: not all issues carry equal weight, and prioritization matters. Equally important is locating where the barrier lives: in the content, the technology, the workflow, or institutional policy. Barriers in the last two categories often go undetected by content-focused audits.

What this stage produces

  • Categorized barrier inventory by type and severity
  • Distinction between access blockers and usability friction
  • Identification of workflow and policy-level barriers

Multidisciplinary barrier analysis — involving educators, technologists, and learners with disabilities — consistently produces more actionable findings than single-discipline reviews.

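One way to make the barrier inventory concrete is a small data model. The Python sketch below is hypothetical, not a prescribed schema — the enum values simply mirror the four dimensions, the blocker/friction distinction, and the four possible locations of a barrier described above.

```python
from dataclasses import dataclass
from enum import Enum

class BarrierType(Enum):
    PHYSICAL = "physical/motor"
    COGNITIVE = "cognitive"
    SENSORY = "sensory"
    SITUATIONAL = "situational"

class Severity(Enum):
    BLOCKER = 1    # task cannot be completed at all
    FRICTION = 2   # task possible, but notably harder

class Locus(Enum):
    CONTENT = "content"
    TECHNOLOGY = "technology"
    WORKFLOW = "workflow"
    POLICY = "policy"

@dataclass
class Barrier:
    description: str
    type: BarrierType
    severity: Severity
    locus: Locus

# Invented example entries for illustration only.
inventory = [
    Barrier("Quiz timer cannot be extended",
            BarrierType.COGNITIVE, Severity.BLOCKER, Locus.POLICY),
    Barrier("Low-contrast link colour in course template",
            BarrierType.SENSORY, Severity.FRICTION, Locus.CONTENT),
]

# Access blockers are triaged before usability friction.
blockers = [b for b in inventory if b.severity is Severity.BLOCKER]
print(len(blockers))
```

Note that the first example barrier lives at the policy level — exactly the kind of finding a content-only audit would miss.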
Stage 3: Technical Audit (WCAG 2.1/2.2 · Technical)

Automated tools can rapidly detect a subset of issues — missing alt text, poor heading structure, colour contrast failures, ARIA errors. However, they can only identify a portion of real accessibility barriers. Manual testing using assistive technologies — screen readers, keyboard-only navigation, magnification tools — is essential for capturing what requires human judgment. Results should be interpreted in context, not simply reported as pass/fail.

What this stage produces

  • WCAG compliance mapping by criterion and content type
  • Screen reader and keyboard navigation findings
  • Colour contrast, visual structure, and readability analysis

Automated tools detect roughly 30–40% of real accessibility issues. A review that relies on automation alone will miss the majority of what affects learners in practice.

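The colour contrast check is one of the few fully mechanical WCAG criteria, and its formula is defined in the standard. A minimal Python sketch of WCAG 2.x relative luminance and contrast ratio (SC 1.4.3 requires at least 4.5:1 for normal text):

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance from 0-255 sRGB components."""
    def channel(c):
        c = c / 255
        # Linearize the gamma-encoded sRGB channel, per the WCAG definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter over darker."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background: the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

What the formula cannot tell you is whether the high-contrast text is readable, well structured, or meaningful — which is exactly the gap manual testing fills.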
Stage 4: User-Centered Evaluation (Human-centered · Evidence-based)

Legal enforcement increasingly scrutinizes whether users can independently complete tasks on digital platforms — not just whether technical standards are met. Engaging learners with disabilities in structured evaluation surfaces the gap between "technically compliant" and "actually usable." This includes heuristic evaluation, task-based testing, and systematic documentation of feedback. Small, focused tests yield more actionable insights than large undifferentiated studies.

What this stage produces

  • Task completion analysis and friction mapping
  • Heuristic evaluation findings from disability-centered perspectives
  • Documented user feedback with systematic analysis

Experiences that meet technical compliance but remain unusable with screen readers have been frequently cited in accessibility complaints and legal settlements. Compliance and usability are not the same thing.

Stage 5: Learning Design Review (UDL-aligned · Learning design)

A technically accessible course can still exclude learners if its learning design is inflexible or its assessments assume a narrow range of abilities. This stage evaluates whether the learning experience offers multiple means of Engagement (the why), Representation (the what), and Action & Expression (the how). It also assesses whether assessment tasks are designed equitably — not just whether platforms pass a technical checklist.

What this stage produces

  • UDL alignment map across engagement, representation, and expression
  • Assessment equity analysis
  • Recommendations for flexible pathways and varied representation

Traditional assessment methods often conflict with UDL principles, favouring standardized formats that reflect performance under specific conditions rather than mastery of learning outcomes.

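A UDL alignment map can be sketched as a simple coverage check: tag each activity with the principles it supports, then look for principles no activity covers. The activities and tags below are invented examples, not a real rubric.

```python
# Each activity is tagged with the UDL principles it supports.
# These entries are illustrative only.
course_activities = {
    "video lecture with captions and transcript": {"representation"},
    "choice of essay or recorded presentation": {"action_expression"},
    "timed multiple-choice quiz": set(),  # single fixed format, no flexibility
}

principles = {"engagement", "representation", "action_expression"}

covered = set().union(*course_activities.values())
gaps = principles - covered
print(sorted(gaps))  # ['engagement']
```

A gap like this points the review toward a design-level recommendation — adding varied pathways into the material — rather than a technical fix.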
Stage 6: Reporting & Remediation Planning (Actionable · Capacity-building)

Findings are only useful if they can be acted on. Effective reporting translates technical and learning design findings into a prioritized remediation roadmap — distinguishing quick wins from longer-term structural changes, and connecting recommendations to real governance, workflow, and procurement contexts. Reports serve multiple audiences: technical teams need specific guidance; learning designers need design principles; leadership needs strategic framing and risk context.

What this stage produces

  • Prioritized remediation roadmap with effort and impact framing
  • Differentiated outputs for technical, design, and leadership audiences
  • Connection of findings to governance, policy, and procurement
  • Framework for continuous improvement, not one-time remediation

Establishing a framework for continuous improvement — rather than treating accessibility as a one-time audit — is essential. Accessibility issues accumulate over time, and periodic review prevents the buildup of unresolved barriers.
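The quick-win versus structural-change distinction can be expressed as a simple prioritization: sort findings by impact (highest first) and effort (lowest first). The three-point scoring scale and the sample findings below are an invented illustration, not a standard.

```python
# Hypothetical findings scored on a simple 1-3 scale for
# learner impact and remediation effort.
findings = [
    {"item": "Add missing alt text",              "impact": 3, "effort": 1},
    {"item": "Replace inaccessible PDF workflow", "impact": 3, "effort": 3},
    {"item": "Fix low-contrast footer links",     "impact": 1, "effort": 1},
]

# Quick wins first: highest impact, then lowest effort.
roadmap = sorted(findings, key=lambda f: (-f["impact"], f["effort"]))
print([f["item"] for f in roadmap])
```

The sort surfaces the high-impact, low-effort item first and leaves the structural change (the PDF workflow) visible as a longer-term priority rather than dropping it.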

Key Decisions

What Shapes an Accessibility Review

No two accessibility reviews look the same. The decisions an organization makes about scope, depth, timing, and involvement determine what kind of review is appropriate and what it will produce. Understanding these dimensions helps set realistic expectations — and make better use of the review process.

Scope

What is actually being reviewed?

Spot audit — a single course, module, document, or platform feature
Program review — a full learning pathway or credential
Institutional review — systems, policies, and practices organization-wide

Depth

How thorough does the review need to be?

Rapid assessment — key barriers identified efficiently for prioritization
Comprehensive audit — full WCAG, UDL, and user-centered evaluation
Ongoing review — iterative, embedded in the design cycle over time

Learner Involvement

Are learners with disabilities part of the evaluation?

Expert review only — specialist assessment without direct user testing
Expert + user testing — structured participation from disabled learners
Co-design — learners involved in shaping remediation and redesign

Timing in the Design Cycle

When does the review happen relative to content development?

Pre-development — informing design decisions before content is built
Pre-launch — catching issues before content goes live, at far lower cost than post-launch fixes
Post-launch remediation — addressing barriers in existing content

Reporting & Use of Findings

Who needs to act on the findings, and how?

Technical documentation — specific findings for designers and developers
Strategic summary — for leadership, governance, and procurement
Capacity building — sharing knowledge across the team through the review process itself

Focus Area

Where is the review primarily directed?

Digital content — web, LMS, documents, media, and assessments
Learning design — UDL alignment and assessment equity
Systems & governance — institutional policy, procurement, and workflows

A note on rapid vs. comprehensive approaches:
Rapid assessments trade completeness for speed and can be genuinely useful for prioritization — particularly when resources are constrained. The important thing is to be clear about what rapid assessment can and cannot detect, and to treat its findings as a starting point rather than a complete picture.

Learner Impact

What Changes — and When

Accessibility review creates change at two different horizons. The shorter-term outcomes are tangible and measurable. The longer-term shift is cultural — and has a much broader effect on how an organization designs and delivers learning.

Shorter Term

What becomes visible and fixable

  • Barriers are named and prioritized: a clear picture of what's blocking learners, organized by severity and what it would take to address.
  • Immediate issues are remediated: many common barriers — missing captions, poor contrast, absent alt text, inaccessible documents — can be addressed quickly once identified.
  • Compliance is documented: a formal review and remediation roadmap provide evidence of due diligence for regulatory and governance purposes.
  • Team knowledge grows: designers, developers, and faculty gain practical accessibility literacy through the review process itself.

Longer Term

What becomes embedded in practice

  • Accessibility becomes a design habit: rather than a retrofit, accessible design is incorporated from the beginning — reducing future remediation costs significantly.
  • All learners benefit: better structure, clearer language, flexible formats, and consistent navigation improve the experience for every learner — not only those with disabilities.
  • Institutional culture shifts: the focus moves from individual accommodation toward proactive inclusion — affecting procurement decisions, faculty development, and organizational identity.
  • Governance and procurement improve: organizations develop the institutional literacy to require and evaluate accessibility in vendor selection and new platform adoption.

On the evidence for accessibility and learning outcomes:
Research consistently shows that accessibility improvements support retention, participation, and equity — not just compliance. The strongest case for accessibility investment is not the legal argument alone, but the evidence that inclusive design produces better learning environments for all learners.

Legal & Regulatory Context

The Legislative Landscape in Canada and BC

Understanding the legal context helps organizations make sense of their obligations, the gaps in current enforcement, and why the direction of travel matters even where specific technical requirements are still developing.

The gap between legislation and practice

Both the Accessible Canada Act (ACA) and the Accessible BC Act have been critiqued for the absence of specific enforceable technical standards and clear timelines. In practice, organizations currently have significant latitude in how they interpret their obligations — but the regulatory direction is clearly toward greater specificity over time.

Understanding this landscape helps organizations make informed decisions about their accessibility practice — not just about minimum compliance, but about where the field is heading and how to build genuinely inclusive learning environments within that context.

On the relationship between legal compliance and genuine accessibility:
Meeting current legal requirements does not guarantee that learning environments are genuinely accessible. The gap between minimum compliance and meaningful inclusion is where most of the work in educational accessibility actually lives. Legal frameworks provide a floor, not a ceiling.

Reflection Tool

Where Is Your Organization in This Work?

This short reflection is designed to help you think through your organization's current relationship with accessibility — not to score or evaluate you, but to surface useful questions and help you identify where the most productive focus might be.

There are no right or wrong answers here. Select the response that most honestly reflects your current situation. The reflection takes about two minutes and ends with some questions worth sitting with.

Sources & Further Reading

The Scholarship This Work Draws On

The frameworks, methodology, and evidence base in this resource are grounded in the following sources, organized by the knowledge area they most directly support.

Universal Design for Learning

CAST. (Ongoing, latest 2024 edition). UDL Guidelines.

udlguidelines.cast.org

The foundational framework providing principles for flexible learning environments that address learner variability. The primary scholarly-practice bridge for inclusive learning design and the basis for evaluating whether learning experiences offer multiple means of engagement, representation, and action and expression.

Ok, M. W., et al. (2023). The effectiveness of universal design for learning: A systematic review. Cogent Education, Taylor & Francis.

A meta-analysis confirming UDL's positive impact on diverse learners across educational settings. Useful when making the evidence-based case for proactive inclusive design over accommodation-based approaches.

Al-Azawei, A., Serenelli, F., & Lundqvist, K. (2016). Universal design for learning: A content analysis of peer-reviewed journal papers from 2012 to 2015. Journal of the Scholarship of Teaching and Learning.

Synthesizes UDL research evidence across the literature, useful for understanding the breadth of application and the state of the field as it matured into mainstream instructional practice.

Scalise, K., & Gifford, B. (2019). Inclusion, universal design and universal design for learning. Prospects (PMC).

Connects universal design and UDL to systemic inclusivity in higher education, grounding the frameworks in broader institutional and equity contexts rather than treating them as purely instructional tools.

Inclusive Design

Watkins, C., Treviranus, J., & Roberts, V. (2020). Inclusive Design for Learning. Inclusive Design Research Centre and Commonwealth of Learning.

Outlines practices for creating adaptable digital learning content. Complements both UDL and WCAG by situating inclusive design within broader questions of open education, content flexibility, and equitable access across diverse learner populations and contexts.

Accessible Course & Platform Design

Bederson, B. B., & Meyer, D. E. (2017). Inclusive Design for Online and Blended Courses. ERIC.

Links WCAG technical standards with UDL principles specifically in the context of online and blended learning. Practically useful for reviews that span both the technical audit and the learning design assessment stages.

Professional Standards & Audit Framework

International Association of Accessibility Professionals (IAAP). CPACC Body of Knowledge.

accessibilityassociation.org

The clearest cross-disciplinary foundation for accessibility review competency, integrating disability knowledge, universal design, and management systems in a single framework. For technical review, pair with the WAS Body of Knowledge and WCAG 2.1/2.2.

A practical note on using these sources together:
Use UDL to evaluate whether learning experiences offer genuine multiple means of engagement, representation, and expression. Use WCAG to audit the digital content and platforms. Use inclusive design scholarship to assess the organization's systems and processes. For reporting to leadership, pair standards-based findings with the evidence base showing that accessibility supports retention, participation, and equity — not only legal compliance.