Accessibility Testing Results

This project went through several rounds of accessibility testing to make sure each page works well for different types of users. The goal wasn’t just to pass automated tools, but to understand how real people with different needs might interact with the site. This page explains the tools that were used, the manual checks that were performed, and what the results showed about the overall accessibility of the project.

Automated Testing Tools Used

Automated tools helped catch issues quickly and gave a clear starting point for improvements. These tools were used throughout the project to verify document structure, color contrast, and the correct use of ARIA and other accessibility features.

WAVE Web Accessibility Evaluation Tool

WAVE was used on every page to check for missing alt text, contrast problems, heading structure issues, and ARIA errors. It also helped confirm that interactive elements were labeled correctly and that the layout didn’t create barriers for screen readers.
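The missing-alt check that WAVE performs can also be approximated locally. Below is a minimal sketch using Python's standard-library html.parser; the function name and sample markup are illustrative, not part of the project, and a real audit would also consider decorative images and ARIA alternatives.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute entirely."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # alt="" is valid for purely decorative images,
            # so only a completely missing attribute is flagged.
            if "alt" not in attr_map:
                self.missing.append(attr_map.get("src", "(no src)"))

def find_images_missing_alt(html_text):
    checker = AltTextChecker()
    checker.feed(html_text)
    return checker.missing

# Example: the second image has no alt text at all.
sample = '<img src="logo.png" alt="Site logo"><img src="chart.png">'
print(find_images_missing_alt(sample))  # ['chart.png']
```

A check like this catches only the structural half of the problem; whether the alt text is actually meaningful still requires a human review.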

W3C HTML Validator

The HTML Validator ensured that each page followed proper structure. Valid HTML helps assistive technologies read the page correctly, so this step was important for keeping everything consistent and predictable.

Color Contrast Analyzer

This tool checked the contrast between text and background colors in both light mode and dark mode. It helped confirm that the color choices met WCAG AA contrast requirements, which is important for users with low vision or color sensitivity.
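The contrast math the tool applies is defined by WCAG 2.x, so it can be reproduced directly. The sketch below implements the published relative-luminance and contrast-ratio formulas; the helper names and the sample colors are mine, not the project's actual palette.

```python
def _channel(c8):
    """Linearize one 8-bit sRGB channel per the WCAG 2.x formula."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, always >= 1 (lighter color in the numerator)."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def meets_aa(fg, bg, large_text=False):
    """WCAG AA requires 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Mid-gray #777 on white sits just below the normal-text threshold.
print(meets_aa((119, 119, 119), (255, 255, 255)))  # False
```

Running both modes' color pairs through a function like this is an easy way to re-verify contrast after any theme change.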

Browser DevTools Accessibility Panel

The built‑in accessibility panel in the browser was used to inspect roles, names, and the computed accessibility tree. This helped verify that screen readers would understand the purpose of each element and that nothing was mislabeled or missing.

Manual Testing Performed

Automated tools are helpful, but they don’t catch everything. Manual testing confirmed that the site is genuinely usable and predictable for real users, not just technically compliant.

Keyboard Navigation

Each page was tested using only the keyboard. This included tabbing through links, buttons, form fields, and interactive controls. The goal was to make sure the tab order made sense and that nothing required a mouse to operate.
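One tab-order problem that can be caught before manual testing is a positive tabindex value, which overrides the natural DOM order. The sketch below flags those; it is a heuristic I am adding for illustration (the sample markup is invented), not one of the project's actual test scripts.

```python
from html.parser import HTMLParser

class TabindexAudit(HTMLParser):
    """Flags elements with a positive tabindex, which overrides the
    natural DOM tab order and often produces a confusing focus sequence."""
    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        value = dict(attrs).get("tabindex")
        # tabindex="0" (natural order) and "-1" (focusable by script
        # only) are fine; only strictly positive values are flagged.
        if value is not None and value.lstrip("+").isdigit() and int(value) > 0:
            self.flagged.append((tag, int(value)))

def find_positive_tabindex(html_text):
    audit = TabindexAudit()
    audit.feed(html_text)
    return audit.flagged

sample = '<a href="/" tabindex="3">Home</a><button tabindex="0">OK</button>'
print(find_positive_tabindex(sample))  # [('a', 3)]
```

Even with a clean report from a check like this, tabbing through each page by hand remains necessary, since source order itself can still be illogical.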

Focus Visibility

Focus indicators were checked in both light and dark modes. Clear focus outlines help users know where they are on the page, especially when navigating forms or tables.
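A frequent cause of invisible focus is a stylesheet that removes the default outline without replacing it. The regex scan below flags that pattern; it is a rough heuristic sketched for illustration (a site may legitimately suppress the outline and supply a custom :focus-visible style elsewhere), not a check the project necessarily ran.

```python
import re

# Matches declarations like "outline: none" or "outline: 0", which
# remove the browser's default focus ring.
OUTLINE_SUPPRESSED = re.compile(r"outline\s*:\s*(?:none|0)\b", re.IGNORECASE)

def flags_suppressed_outlines(css_text):
    """True if the stylesheet removes the default outline anywhere."""
    return bool(OUTLINE_SUPPRESSED.search(css_text))

print(flags_suppressed_outlines("a:focus { outline: none; }"))        # True
print(flags_suppressed_outlines("a:focus { outline: 2px solid #0af; }"))  # False
```

A hit from this scan is a prompt to verify by eye that some visible focus style still exists, which is exactly what the manual light/dark check above covers.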

Screen Reader Testing

Pages were tested with a screen reader to confirm that headings were announced correctly, labels matched their inputs, and the table structure was read in a logical way. This also helped verify that the form’s required fields and error messages were announced properly.
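The label-to-input pairing that the screen reader relies on can be pre-checked structurally. The sketch below reports form fields whose id has no matching label for= attribute; the sample markup is invented, and the check deliberately ignores wrapping labels and aria-label, so it complements rather than replaces the screen reader pass.

```python
from html.parser import HTMLParser

class LabelAudit(HTMLParser):
    """Pairs <label for="..."> targets with form-field id values."""
    def __init__(self):
        super().__init__()
        self.label_targets = set()
        self.field_ids = set()

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "label" and "for" in attr_map:
            self.label_targets.add(attr_map["for"])
        elif tag in ("input", "select", "textarea") and "id" in attr_map:
            self.field_ids.add(attr_map["id"])

def unlabeled_field_ids(html_text):
    """ids of fields with no <label for=...> pointing at them."""
    audit = LabelAudit()
    audit.feed(html_text)
    return sorted(audit.field_ids - audit.label_targets)

sample = (
    '<label for="email">Email</label><input id="email" type="email">'
    '<input id="phone" type="tel">'
)
print(unlabeled_field_ids(sample))  # ['phone']
```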

Responsive Behavior

The site was viewed on smaller screens to make sure the layout didn’t break. Tables, forms, and navigation were checked to confirm they stayed readable and easy to use on mobile devices.
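One small piece of responsive behavior is mechanically checkable: without a viewport meta tag, mobile browsers render the page at desktop width regardless of the CSS. The sketch below tests for that tag; it is an illustrative pre-check I am adding here, and the visual layout review described above is still the real test.

```python
from html.parser import HTMLParser

class ViewportCheck(HTMLParser):
    """Looks for <meta name="viewport">, which mobile browsers need
    in order to scale the layout to the device width."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "meta" and attr_map.get("name") == "viewport":
            self.has_viewport = True

def has_viewport_meta(html_text):
    check = ViewportCheck()
    check.feed(html_text)
    return check.has_viewport

page = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
print(has_viewport_meta(page))  # True
```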

Summary of Findings

Overall, the testing showed that the site is accessible, consistent, and easy to navigate. The main results: every page used valid HTML with a correct heading structure and properly labeled controls, text met WCAG AA contrast requirements in both light and dark modes, all functionality was operable by keyboard with visible focus indicators, screen readers announced headings, labels, required fields, and error messages correctly, and the layout stayed readable and usable on small screens.

Final Notes

Accessibility isn’t something that gets “finished.” It’s something that improves over time as new tools, devices, and user needs appear. This project reflects the current state of the site and shows a strong understanding of accessible design. Future updates could include more advanced testing, user feedback, or additional features to support even more types of users.