Nullvariable

Is Usability Getting Better?

by Doug on March 2, 2010

Usability refers to how well people can use something. It can be anything from a sign on a door letting you know which direction the door opens (making it more likely you’ll be able to open it on the first try) to arranging the navigation on a website in a logical fashion.

Recently Jakob Nielsen posted about Progress in Usability, specifically with regard to websites. He comes to several interesting conclusions, and states that it will take us 74 years to reach six-sigma quality.
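To put a projection like that in perspective, here's a rough sketch of the compounding arithmetic behind it. The starting success rate and annual improvement rate below are illustrative placeholders of my own, not Nielsen's published figures:

```python
import math

# Illustrative placeholder rates, not Nielsen's published figures
current_success = 0.80        # assume 80% of web tasks succeed today
annual_improvement = 0.06     # assume the failure rate shrinks by 6% each year
six_sigma_failure = 3.4e-6    # six-sigma quality: ~3.4 defects per million opportunities

current_failure = 1 - current_success

# With compounding improvement, failure(t) = current_failure * (1 - annual_improvement) ** t.
# Solve for the year t at which failure(t) drops to the six-sigma level.
years = math.log(six_sigma_failure / current_failure) / math.log(1 - annual_improvement)
print(f"About {math.ceil(years)} years at these placeholder rates")
```

With these particular placeholders the answer comes out closer to two centuries than to 74 years, which mostly shows how sensitive this kind of projection is to the assumed improvement rate.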

I have a couple of problems with his logic and data. First, he uses data from 262 websites that his company has collected formal usability metrics for, but he doesn't state at what phase those metrics were collected. It's not clear whether the data was gathered before or after the sites were tested, or whether any work had been done to improve usability before the measurements were taken. I'm assuming these metrics were collected before the sites were optimized. What I'd like to know is how much improvement was actually made across these sites; just because the default, out-of-the-box design wasn't 100% usable doesn't mean that subsequent generations only improved by 6%.

Another factor Nielsen ignores is that users are getting smarter. We know that users learn and adapt over time. Are websites getting more complex and staying “un-usable” even to more advanced users? I doubt it. So Nielsen's numbers could actually be a lot worse than they look: most of us have been using the Internet long enough to figure a few things out along the way and are more likely to accomplish our goals regardless of how much the sites themselves improved. Or it could be that my next point is what's skewing his data.

My final problem with the conclusions Nielsen reaches is that he's only got 262 data points to compare. That's not even remotely close to the number of websites on the Internet, with Netcraft reporting that its survey touched 206 million websites in January 2010. (In 1995 there were about 18,000 websites; in 2006 we hit 100 million.) The Internet has exploded in size, and yet Nielsen's data covers a mere 26 websites a year. It's hard for me to believe we can judge usability progress accurately through such a tiny window into the massive amount of data that is the Internet. I have to think these numbers are skewed unless the number of sites measured grows proportionally each year.
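Just to make the scale of that gap concrete, here's a quick back-of-the-envelope calculation; the roughly ten-year study span is my own assumption, based on the 26-sites-per-year figure above:

```python
# Back-of-the-envelope scale comparison; the ten-year study span is an assumption
sites_studied = 262
netcraft_jan_2010 = 206_000_000   # roughly 206 million sites in Netcraft's January 2010 survey
study_span_years = 10             # assumed span, consistent with the ~26 sites/year figure above

sampling_fraction = sites_studied / netcraft_jan_2010
sites_per_year = sites_studied / study_span_years

print(f"The sample covers about {sampling_fraction:.7%} of the sites Netcraft counted")
print(f"That works out to roughly {sites_per_year:.0f} sites studied per year")
```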

Nielsen even admits that he left out 15 major websites his company recently studied because he didn't want them to bias the data. While I agree with that decision, I still believe his data is too biased to support the grand statements he's making.

Perhaps I misunderstand what Nielsen is trying to say, but I can't help thinking two things: his slice of the pie is simply too tiny to hold up against the whole of the Internet, and we're not seeing the important numbers (how much a site can improve just by working at it). I don't know of any other process measured against six sigma that improves with zero effort.

I believe that education is the biggest factor in improving things like this. What do you think? Did I miss something in Nielsen's data? What tactics do you use to increase usability on your site so you can stand out among 206 million websites?

  • http://twitter.com/techherding Dick Carlson

    I'd also add that “correlation does not imply causation”. Ten years ago, I'd spend quite a bit of time trying to figure out how to find something on a web site — maybe ten minutes. Today, after a few seconds, I'll give up and go somewhere else. So the “bar” for usability is much higher.

    That said, I still think Jakob is a God. I've learned more about web design and testing from him than anyone else on the planet.

  • http://www.nullvariable.com nullvariable

    I totally agree, he's a very sharp guy; I just don't think he considered all the angles here.
