A marketing email from Loop11, the maker of online usability testing software, did the rounds among local government IT managers, marketing teams and mailing groups last week claiming in its subject that "City Of Melbourne website ranks 2nd in Australia."
The email, offering a usability case study of six of Australia's capital city websites, was passed around some of the execs and managers at my day job with much interest, then handed to me for evaluation - well, actually it was more of a "TL;DR. Why aren't we on top of this list?" (We're not a capital city, for starters. :-))
On reading the published report, a few things immediately stood out to me. First of all, this post is not intended as a rant, merely an observation and commentary on both the marketing aspect of the report and the user testing process. Loop11 looks like a great tool, and this exercise has most likely drummed up quite a bit of interest in their product, as well as in the idea of user testing in general.
There is some obvious link-baiting going on in the email subject. Not to detract at all from Melbourne's great website (of which I'm a fan), but it came second out of six capital city websites, which is not exactly "2nd in Australia" as the email's subject claims. There's no doubt that the sole purpose of the report was an attention-grabbing marketing exercise for Loop11's user testing software. That's not necessarily a bad thing; however, the test itself seems rushed and inconclusive.
The six council websites were tested by 600 random internet users from around the world (100 testers per website), each completing one single task. The testers were recruited from Amazon's Mechanical Turk service - which, to me, sounds like the internet equivalent of a sweatshop - and were paid 5 cents each for completing the task.
When conducting your own testing, you would do better to test a much smaller selection of people. Usability guru Jakob Nielsen argues that around 85% of usability problems can be found with only 5 users (and a follow-up test with 5 more users should pick up most of the remaining problems). You would also likely offer higher compensation (such as a free cinema ticket) to get better buy-in from your participants.
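Nielsen's figure comes from a simple diminishing-returns model: the proportion of problems found by n testers is 1 − (1 − L)^n, where L is the chance a single tester hits any given problem (Nielsen's research put L at roughly 31%). A quick sketch of that model, with the 31% figure assumed from Nielsen's published work:

```python
def problems_found(n_users, hit_rate=0.31):
    """Estimated share of usability problems uncovered by n_users,
    using Nielsen's model: 1 - (1 - L)^n, where L is the probability
    that one tester encounters any given problem."""
    return 1 - (1 - hit_rate) ** n_users

# Coverage climbs steeply at first, then flattens out:
for n in (1, 5, 10, 15):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems found")
```

Running this shows 5 users finding roughly 84% of problems and 10 users around 98% - which is why the usual advice is several small rounds of testing rather than one big one.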
The task the testers had to complete was "Find out what day your household waste is collected". So in the end, the result is less "most usable website" and more "most prominent waste collection link". In a real-world test scenario, you would obviously test more website functions across a number of council services.
As I said before, I actually really like Loop11's software and think it can be very beneficial when testing is done properly. However, the price tag of US$350 per test ended up being a bit of a sore point for us. As a developer of web applications who enjoys a challenge, it has definitely given me something to think about - perhaps developing my own in-house user testing application in the future.