I recently received an email telling me that my website had the fewest accessibility errors in a pool of dozens of other accessibility experts’ and firms’ websites. This was interesting. And yet there were several things I disputed.
In “Accessibility guru, heal thy site”, Jonathan Pool describes the tests he ran and the results. Pool uses a new automated testing suite called “Autotest”. This prototype tests for a whole bunch of things that the big accessibility testing tools, such as axe-core or WAVE, appear not to test.
It checks for movement on a page, 3-D layering, deprecated roles, nonstandard navigation, and several other things.
I think this is an interesting approach, but it’s far from ready for prime time.
Results for incl.ca
The tests found a couple of things actually broken on the page:
- Skip link not working
- A <fieldset> without a <legend>
Both of those came from an update to CSS styling delivered via CDN. I did not know it had happened, and I had no direct control over it. I fixed it by manually changing the class name. Moral of the story: Don’t rely on third-party resources without checking them regularly!
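For readers unfamiliar with the second issue: a `<fieldset>` groups related form controls, and its `<legend>` gives that group an accessible name that screen readers announce. A minimal sketch (the form fields here are hypothetical, not my actual markup):

```html
<!-- A <fieldset> should carry a <legend> so the purpose of the
     group is announced to screen reader users. -->
<fieldset>
  <legend>Newsletter subscription</legend>
  <label for="email">Email address</label>
  <input type="email" id="email" name="email">
</fieldset>
```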
- One test shows a missing label for an element with a complementary role. This flags my language switcher, which uses <aside> and therefore has an implied role of complementary. I don’t believe it’s a major issue by any stretch of the imagination. But I’ll see about fixing it, for the sake of fixing it, in the coming weeks.
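The fix the tool is asking for amounts to giving the landmark an accessible name, for example with `aria-label`. A hedged sketch (the link and label text are illustrative, not my actual switcher):

```html
<!-- An <aside> has an implicit role of "complementary"; naming the
     landmark lets screen reader users tell it apart from others. -->
<aside aria-label="Language switcher">
  <a href="/fr/" lang="fr" hreflang="fr">Français</a>
</aside>
```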
- The “log” test is the one that gave the highest failure points. Mr. Pool told me in a follow-up email that this had to do with missing fonts creating errors. I dispute that this is an actual accessibility problem. Mr. Pool stated that if it was a dyslexia-friendly font and it wasn’t loading, it would indeed be a problem. Except that dyslexia-friendly fonts are not an actual thing.
- There is a “zIndex” stacking error. But I’m using z-index to expose the skip link on focus. While stacking can cause problems in some cases, in this situation it does not.
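This is a common, deliberate pattern: the skip link is visually hidden until it receives keyboard focus, and z-index simply makes sure it sits above surrounding content when it appears. A sketch under assumed class names and values (not my actual stylesheet):

```css
/* Hypothetical skip-link styling: hidden off-screen by default,
   revealed and layered on top of the page when focused. */
.skip-link {
  position: absolute;
  left: -9999px;
}
.skip-link:focus {
  left: 0;
  top: 0;
  z-index: 100; /* stacking here is intentional, not an error */
}
```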
- Finally, the testing tool picked up style differences. My analysis does not indicate these cause actual problems for disabled users.
It was interesting to see that my homepage ranked best among the dozens of other accessibility experts’ and firms’ sites tested. My site also happens to be fairly simple.
It was no surprise to see overlay providers ranking rather low in these tests.
But automated testing will only find some of the errors on a page. And the errors it does find need to be considered, analyzed, and evaluated. Not all issues flagged by automated tools are actual problems. Then there’s the bias of the tool maker, who weighs different items differently, giving more or less importance to some results.