Tech Book Face Off: Prioritizing Web Usability Vs. Don't Make Me Think

According to Netcraft.com, there are over 185 million active websites among over 630 million hostnames on the internet as of March 2013. That is a lot of websites, and I would be willing to bet that the vast majority of them are poorly designed and not very user friendly. I've definitely seen my fair share of them. When I find especially well-designed sites, they can be a pleasure to use. Some of them are truly entertaining, while others have such an elegant user interface that you hardly even notice the complexity they contain. A well-designed site will take no more than a few seconds to figure out, with everything you would like to do on the site being intuitively obvious. You would think such great designs would be more common, seeing as they are in plain sight for every web user to see, but alas, it would seem that a lot of web designers out there are not paying very close attention. It's not too difficult to make some big improvements in your web design, and there are a number of books out there that can point out the obvious, some better than others. Here are two:

Prioritizing Web Usability vs. Don't Make Me Think: A Common Sense Approach to Web Usability, 2nd Edition

Prioritizing Web Usability


Dr. Jakob Nielsen is the guru of web usability. He has been studying, writing, and advocating for better web usability for decades, and his advice is generally accepted as truth when it comes to improving websites. If he is anything, he is thorough. At 432 pages, Prioritizing Web Usability is essentially the reference on website user interface design. I actually made the mistake of reading his Designing Web Usability as well, but you shouldn't. It's another 432-page reference (is there some meaning in 432?), only from 7 years earlier, and it's pretty much obsolete.

Dr. Nielsen does cover a lot of ground, including a very interesting review of how the web has improved since Designing Web Usability, but I couldn't help feeling that his explanations were excessively drawn out. By the time I reached the chapter on "Writing for the Web," which contained the advice to cut down the prose on websites to the bare necessities, I was thinking a fair amount of cutting might have improved the readability of this book. The authors did argue that the nature of the web as a medium demands succinct writing in a way that print does not. I was not convinced. Then I read Don't Make Me Think, with its advice to cut half the content and then cut half of what's left, and I thought even more strongly that there's no reason that shouldn't apply to print as well. Cut anything that doesn't directly add to the reader's understanding of what you are saying. They will thank you for it!

As it stands, I ended up only reading the prose and skipping the picture captions. This method probably cut about a third of the length from the book, and a fair amount of the redundancy. After finishing, I figured I could have gotten just as much out of it by only reading the captions and skimming the prose for any other keen insights. Don't misunderstand me, this book contains a wealth of knowledge on how to build a better website, and it's all clearly explained with good examples. The problem is that the key advice is watered down with a lot of extra points and wordy arguments to back up the reasoning. A book that was half the size with more focused advice would have ended up being much stronger, as we will soon see.

Don't Make Me Think


At 216 pages, Steve Krug's book is actually exactly half the size of Prioritizing Web Usability. Now I'm wondering if that was a coincidence. Steve's writing is clear, concise, and entertaining. The layout of the book is excellent, with great examples of websites that are trimmed to show only the point that he's trying to get across. It's so well done that it will only take you a few hours to read through it, and it will open your mind. Once you are aware of the issues, you'll see them in every website you visit, good or bad. That awareness alone will make you a better designer.

Even my wife read through this book and enjoyed it, and she's not a technical person. Steve shows you what you need to know, and he does it in a way that is accessible to everyone. That is the sign of a great teacher, and a great book. My wife was so impressed with it that she went on and read Rocket Surgery Made Easy, which goes in depth on usability testing. She liked it just as much as Don't Make Me Think. I haven't read it yet, but I'm planning to. I don't have much else to say, except, you should read Don't Make Me Think. Get the print version. The layout and full color content of the book lends itself much better to print. You can also check out Steve's website for more great advice.

A Little Knowledge Is A Dangerous Thing


If I had to pick one idea from these books that could really make you a better web designer, it's that your intuitions about good web design are wrong. If you are designing websites, your technical expertise will get in the way of usability because what is obvious to you is not obvious to the majority of your users. On top of that, you know too much about your website, and you'll naturally organize it in a way that makes sense to you. But your users aren't coming to your website to learn everything you know. They're either coming to get something done, or to learn what they need to know. To help your users accomplish their goals, your site needs to be simple, clear, and obvious. These books can help, but the best way to achieve that is to get to know your users. You're designing for them, are you not?

So what to read?

If you haven't already decided, let me give one more perspective. My wife also tried reading Designing Web Usability. She couldn't get past the first 100 pages, and since she's a non-technical person, she was constantly asking me questions about what Dr. Nielsen was talking about. That experience has turned her off of reading Prioritizing Web Usability. I think she'll get around to it at some point, but she keeps putting it off. She could read straight through Krug's books, understanding everything. Her recommendation is that everyone, whether they are design engineers at a huge corporation or Johnny down the street setting up a personal website, can gain a lot from reading Steve Krug's books. If you are a technical person with lots of time on your hands, or if you are searching for encyclopedic knowledge of web usability, then Nielsen's book will get you there.

The Evolution of a Software Program

Analogies, anecdotes, and parables are all incredibly powerful literary devices that can deepen your understanding of a complex topic and help you recall more details about it when you need to. When exposed to a new idea, your brain needs something for the new idea to hold onto; otherwise it takes a lot of repetition to lodge it into your long-term memory. These devices work by relating the new idea to something you already know, so that you can make associations between this weird new thing that you're trying to plant in your mind and something familiar that already has strong roots in your memory. The stronger the connections, the faster you can understand the new idea, and the longer you'll remember it.

Beyond the mental connections that they make, analogies, anecdotes, and parables all tell stories about what the writer is trying to convey to the reader, instead of droning on with technical prose that will most likely put the reader to sleep. Stories show you what the writer is conveying rather than tell you what you should know. It's a far more interesting way to learn something, and you are much more likely to stay engaged and actually absorb the ideas when you're interested in what you're reading. I felt like I got so much out of The Pragmatic Programmer, because of the authors' use of stories to make their points.

I especially like analogies for enhancing understanding because they are so easy to find in everyday objects and activities. Both of the books I talked about last week made analogies between software development and other endeavors. Andy and Dave likened it to gardening in The Pragmatic Programmer, and Steve related it to building construction in Code Complete 2. Other analogies abound; just do a Google search for "what software development is like" for a little light reading. I think the variety and sheer volume of associations is excellent. Anything that helps you make new insights and become a better programmer is a plus in my book.

One way to think about software development that I would like to explore is that it is like evolution. Much has been written about software evolution, as in how a software system evolves over time, or the evolutionary model of software development. Those ideas are definitely part of what I'm talking about, but I want to go further and compare software development directly with the theory of evolution. Let's start with some simple associations and see how far we can stretch this analogy.

In the beginning, the theory goes, there was the primordial soup. In it were various organic compounds that had resulted from chemical reactions in the atmosphere. Over time these compounds reacted and combined into more complex polymers and amino acids, the building blocks of life. In the same way, the building blocks of a program are the programming language, libraries, and frameworks that are used to create it. You could go really basic with the building blocks and think of them as the assembly language instructions, or even the machine code that makes up a program, but then we would be describing the evolution of software development in general from machine code to assembly language to higher level languages and abstractions. I think that's a valid association, but I want to look at the evolution of an individual program written in a particular language. It could be any language, but the language is not what's evolving.

Every program has to start somewhere. A simple little proof-of-concept program that takes some input and produces some output is like a single-celled organism, or bacterium, that consumes fuel and produces waste. Okay, maybe 'waste' isn't the best association for a program's output. How about by-product? No, that's not much better. The point is that our little bacterium takes inputs from its environment and converts them in some way to produce something that it emits back into its environment - just like a barely useful initial program does.
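To make the analogy concrete, here's a minimal sketch of such a "bacterium" of a program (my own toy example, not from any of the books): it takes in raw material and emits a transformed by-product, and that's all it does.

```python
# A toy "bacterium" of a program: it consumes input from its
# environment and emits a transformed by-product. Purely
# illustrative; the "metabolism" here is just uppercasing.

def metabolize(nutrient: str) -> str:
    # Take in raw material, give back something new.
    return nutrient.upper()

# One feeding cycle: input in, by-product out.
print(metabolize("glucose"))  # → GLUCOSE
```

Barely useful, but it works end to end, and that's the point: everything that follows is mutation on top of a living organism.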

From this basic program, the designer can add code to increase its capabilities, either by adding features or by improving performance. These improvements are the mutations in the program's evolution. Some mutations will be successful and will be kept, saved in the historic record of the version control system. You are using version control, right? Other mutations won't work so well, and will be discarded as cancelled features or mutated further into bug fixes.

As the program grows and evolves, it will become increasingly complex and interconnected, with newer systems built on top of older systems, just like the human brain is built up from older nervous systems, starting with the spinal cord and brain stem, then adding the limbic system, cerebellum, and finally, the cerebrum. Oftentimes code loses its usefulness but remains in the program, or it becomes redundant but isn't optimized out. We see similar examples in our own evolutionary history, with many redundant parts and other parts that we can't quite explain.

The compiled code of a program is similar to an organism's DNA. It can be replicated easily, and the compiled code is an exact description of how to create an instance of a running program on a computer. The code can be copied from one computer to another through various methods, and there we have the equivalent of reproduction. Each instance of a running program can vary based on its configuration and unique inputs, which could equate to variations in an organism's DNA and the environment it grows up in.

Evolutionary history also goes through cycles of dramatic explosions of mutations and rapid evolutionary progress followed by periods of relative calm and stability that end with mass extinctions. Software development cycles follow a similar pattern with periods marked by rapid evolution of the code base followed by slower periods when programmers take a rest and hopefully go on vacation. Mass extinctions can be thought of as happening at the release of a new software version when it replaces much of the installed base of the old version, sometimes more effectively and sometimes less.

To really be a useful analogy, evolution should lead us to new insights about software development, and I think it does. Starting with a small proof-of-concept program and then building on it, growing it into something more useful and complex with each iteration, is a highly effective way to develop software. Having a working system at every step of development is extremely beneficial. It may seem like building on top of working systems results in a convoluted mess of software layers, but these systems usually have a deeper elegance that takes study and time to appreciate. And of course, these systems have the unmistakable advantage of actually working. How many times have project teams attempted to build monolithic monstrosities only to watch as they collapse under their own weight? Software that evolves incrementally from one working version to the next has a much better chance of succeeding in the long run.

Tech Book Face Off: The Pragmatic Programmer Vs. Code Complete

Once you have received your degree, accepted your first job, and started practicing software engineering, you'll start to realize that there's a lot they don't teach you in college. Many professors tend to focus on the theoretical aspects of computer science and pay little attention to the practical aspects of software engineering. They may expect you to pick up the skills you'll use to practice your craft while you're on the job, or they may not even be aware of this side of software engineering. How do we learn what we need to know in the real world of designing software and writing code? I mean the everyday stuff: writing specs, reading code, adding features, refactoring, testing, reviewing, etc. There are actually quite a few good books that can give you a head start and keep you from making plenty of amateur mistakes. Here are a couple of the most popular classics:

The Pragmatic Programmer: From Journeyman To Master vs. Code Complete 2

Even if you've been programming for decades, if you haven't read anything about the hands-on mechanics of software engineering, these books can give you expert advice on how to improve your programming practices. Alright, let's see what each of these books has to offer.

The Pragmatic Programmer: From Journeyman To Master


I thoroughly enjoyed reading this book. Andrew Hunt and David Thomas have a knack for packaging up programming wisdom into short, entertaining analogies, anecdotes, and parables that bring their points home in a memorable way. Even the titles of their chapters allow you to quickly recall the advice contained in each one. After reading through it once, you can just skim through the chapter titles to refresh your memory on their best programming practices. Here are some of my favorites:

Stone Soup and Boiled Frogs: Have you ever been on a project where you had a great idea for how to do something, but it would take too much effort to get approval for it if you asked? Try implementing a small part of the idea and present it to the team. Comment about how it would be so much better if it had this or that added to make it more robust or polished or whatever. Over time you may be able to get your idea in piecemeal, and the whole team will feel like they shared in the success, because they did. It's a win-win situation. On the other hand, don't let small changes to a project build up in a negative way, or you just might fall victim to feature creep like a frog in water being slowly heated to a boil.

Tracer Bullets: This is a great analogy for the design practice of getting a basic version working early and then iterating rapidly to improve it. They go through many of the advantages of this approach and some of the classic design mistakes that can be avoided by using it. It's an incredibly useful way to do design that every software engineer should know.
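As a rough illustration of the tracer-bullet idea (my own sketch, with invented function names, not code from the book), you wire a thin slice through every layer with stubs, so one request travels end to end from day one:

```python
# Tracer bullet: a thin, working slice through every layer of a
# hypothetical search feature. Each stub gets refined in later
# iterations, but the whole path works right now.

def parse_request(raw: str) -> dict:
    # Stub input layer: real validation comes in a later iteration.
    return {"query": raw.strip()}

def fetch_results(request: dict) -> list:
    # Stub data layer: canned data stands in for a real database.
    return [f"result for {request['query']}"]

def render(results: list) -> str:
    # Stub presentation layer: plain text now, fancier output later.
    return "\n".join(results)

def handle(raw: str) -> str:
    # The tracer bullet itself: one request through every layer.
    return render(fetch_results(parse_request(raw)))

print(handle("  usability  "))  # → result for usability
```

Because the slice already works, every subsequent improvement lands in a running system instead of a pile of unintegrated parts.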

Programming by Coincidence: What happens when you start coding on a project without thinking? You plow headlong into your editor and bang out line after line of code. Then you test it out, fix a few obvious mistakes, and keep going. Eventually you start getting bugs that look a little weirder so you change code at random until the program appears to be working at a superficial level. In the end you'll have a mess of code that doesn't have a prayer of passing rigorous testing or actual customer use. This is programming by coincidence, and Andy and Dave go into it in much more entertaining detail, along with how to avoid it.

There are so many more gems just like these. I couldn't put the book down, and it was over before I knew it. I finished it wanting much more, and it totally changed the way I approach software engineering. In my mind, that's the perfect combination for a book like this. You should definitely read it. You'll get through it in a few evenings on the couch, and it will make you a better programmer. Highly recommended!

Code Complete 2


Maybe I was biased (or spoiled) from reading The Pragmatic Programmer first, but I just couldn't get into Code Complete 2. I had read many glowing recommendations for this book, and I know it is highly regarded in the software industry. But compared to Andy and Dave, Steve McConnell was kind of dry and long-winded. Now don't get me wrong, he filled this book to the brim with great coding advice, and it's almost four times longer than The Pragmatic Programmer, so he covers an amazing amount of ground in 35 chapters. He goes all the way from software requirements to class organization to control structures to code layout to personal character. He covers everything having to do with code construction, hence the title.

There were certainly good things about Code Complete 2. The coding horror sections were always entertaining. If you've been reading old code - other people's and your own - for any length of time, you will definitely notice that many of the coding horror examples can conceivably happen in real code. They are pertinent examples of code gone wrong, along with explanations of why the conventions used in the bad code are confusing, error prone, and generally not a good idea.

If Code Complete 2 is anything, it's comprehensive. For the novice programmer, this is a great thing. Having such an exhaustive amalgamation of programming knowledge in one book can be a great introduction to good programming practices for the new initiate to the software industry. More experienced programmers may like the book as a reference or as a starting point for establishing coding standards on their projects, but as a readable work on the craft of coding, I found it somewhat lacking.

I really wanted to like this book. Everything I read about it recommended it as the definitive work on the subject, and it definitely is. But that doesn't make it an engaging read. It had no focus. Steve alluded to that fact in the beginning of the book, where he claimed that when he went looking for the same information in the literature, he could not find it all in one place; it was dispersed among a multitude of books, papers, and articles. His goal was to collect all that knowledge for future programmers. I believe he achieved that goal, but not in an especially memorable way. I can hardly recall anything that I read in Code Complete 2, barring a few code layout guidelines. I do distinctly remember that while reading it, I was constantly feeling like I had heard most of his advice before. Maybe that lack of new insights is what really kept me from getting into this book.

So Which One Should You Read?


At this point you may think the answer is pretty cut and dried - The Pragmatic Programmer, right? Not so fast. I still think Code Complete 2 is worth reading, just not all at once. You have to pace yourself. Don't try to read it all in a week or two. Those 35 chapters can be broken up into easily digestible snippets that you can read over the course of a few months.

If you do read them both, I would definitely recommend reading The Pragmatic Programmer first. Andy and Dave do a much better job of showing you how to become a better programmer rather than telling you, with clever anecdotes like the Law of Demeter, the theory of broken windows, and "select" isn't broken. You are definitely going to remember a lot of their advice. Of course, there is a lot of overlap between the two books. With Code Complete 2 covering topics in more detail and adding more topics, it's a good in-depth reference to follow The Pragmatic Programmer.
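To give a flavor of one of those lessons, here's my own minimal sketch of the Law of Demeter (the classes here are invented for illustration): an object should talk only to its direct collaborators instead of reaching through them.

```python
# A hypothetical order/customer/wallet example illustrating the
# Law of Demeter: ask your neighbor, don't dig through its guts.

class Wallet:
    def __init__(self, balance: int):
        self.balance = balance

    def deduct(self, amount: int) -> None:
        self.balance -= amount

class Customer:
    def __init__(self, wallet: Wallet):
        self.wallet = wallet

    def pay(self, amount: int) -> None:
        # The customer manages its own wallet.
        self.wallet.deduct(amount)

class Order:
    def __init__(self, customer: Customer, total: int):
        self.customer = customer
        self.total = total

    def settle(self) -> None:
        # Demeter violation would be:
        #   self.customer.wallet.deduct(self.total)
        # which reaches through two objects. Instead, ask the
        # direct collaborator to do the work:
        self.customer.pay(self.total)

order = Order(Customer(Wallet(100)), 30)
order.settle()
print(order.customer.wallet.balance)  # → 70
```

The payoff is that Order no longer cares how a Customer pays, so changing the payment internals later doesn't ripple through every caller.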

Since Code Complete 2 is basically a superset of The Pragmatic Programmer, I wonder if reading Clean Code: A Handbook of Agile Software Craftsmanship and Refactoring: Improving the Design of Existing Code would cover most of what Code Complete 2 does in about the same amount of space, but in a more engaging way. I'm not sure since I haven't read them yet, but they are on my list. At any rate, The Pragmatic Programmer will pull you in like you wouldn't believe a programming book could. You may not be able to put it down. Go read it!

Growing Up With Technology

The other day my wife told me that my son can now turn on the computer, take a DVD out of its case, put it in the DVD drive, and make the right menu selections with the mouse to start a movie. He's two. Granted, sometimes he ends up watching Tinker Bell in French, but hey, maybe he'll pick up a second language. I find it amazing that he can manipulate a mouse that well when he can hardly get a spoonful of macaroni into his mouth without getting some up his nose and the rest in his lap.

This breakthrough got me thinking about how people always say kids can learn new technologies so much faster than the rest of us. Then I started thinking that it's not so much that young kids can learn faster as that they don't have a huge catalog of old interfaces stored in their heads to overcome when learning a new one.

Think about all of the devices you've used throughout your lifetime. I'm young enough that people have thought that I don't know what a record is, but I definitely remember listening to my family's record player when I was growing up. I can still recall the distinct crackling sound that the stylus made while picking up all of the dust and imperfections on the record's surface. I remember how the turntable worked, how the records needed to be cleaned, and how much space they took up in our living room cabinets. That's one of the first devices I can remember. Here's another.

My dad had a hand-wound 8mm movie camera that would take short film clips. He'd have to send the film in to be developed, and then he would splice the clips together into a movie. Every once in a while he would bring out the film projector, and we would all sit around and watch the home movies while he narrated. There was no sound recorded with the movies, but my dad did a great job of making it interesting on his own. I always loved hearing his stories about what was happening on the screen, and I still remember the whir of the fan and the clack-clack-clack that the film made when the spool ran out. Sometimes he would bring out the slide projector and narrate while he flipped through those as well. The experience was fairly similar, even down to the whir of the fan and the clacking of the carousel as it changed the slides. Both of these projectors had their own interfaces as well.

That's pretty much where I picked up in the stream of technology related to movies and music. There are many other technology categories, from communication to transportation to energy, but let's keep it simple and focus on forms of media. I just missed 8-track tapes, and I've lived through cassette tapes, Betamax, VHS, CDs, DVDs, and now MP3s, Blu-rays, and streaming video. Every one of these formats introduced new features that changed the interfaces on the devices that played them, and the device interfaces have continued to evolve on their own as well. Now Blu-ray players have their own GUIs and internet connectivity, and almost any device with a screen smaller than 30 inches is a touch screen - or will be soon.

My kids will never know about most of those older devices. They'll never have to learn their interfaces. They're starting with the GUIs and touchscreens, and that's all they know. I know it's becoming passé to list all of the things that young people never knew existed, but what about things they stop using before they're able to remember? My son tries to touch our computer screen because he thinks it should work like an iPod or a Kindle. He recognizes that it's a similar screen and figures it should be touchable, too. He understands the mouse because he sees us using it, but touching is much more intuitive to him. It's quite possible that our main computer will have a touchscreen in the next few years, and if he's still young enough, he'll never remember it being any other way. From his perspective, every screen will be a touchscreen, and that's just the way things have always been.

Here's another example. I don't plan on getting rid of our movie collection in the next few years, but theoretically I could. Netflix, YouTube, Hulu, Amazon, and massive hard drives are certainly adequate replacements. If I did get rid of it, my son wouldn't remember having to put a disc in a player to watch a movie. Movies would always have been available to watch anytime on any device. If you didn't have to forget the old way of watching movies and learn yet another new interface, you'd probably pick it up pretty fast, too. It's the constant forgetting that gets hard, or maybe exhausting is a better word.

Now that interfaces are almost entirely software, they are evolving and proliferating faster than ever. That means we have to forget the old to learn the new at an ever faster pace. Our kids are growing up in this environment. They are immersed in it, so they are much more comfortable with it than we are. I wonder if they will reach a point in their lives where they suffer from new interface exhaustion, or if that wouldn't happen to them because their brains are wired differently from growing up in a much more interactive, visually immersive world.

Our children are certainly growing up in an environment that has incredible learning tools for quickly developing a markedly different skill set than the one we grew up with. The right video games, computer applications, and high-tech toys (e.g. Lego MINDSTORMS) can greatly improve creative problem solving, logical reasoning, analytical thinking, and abstract modeling skills. Researchers have even found that people who play video games have better surgical skills. Of course, you should be selective about what type of games your kids play. World of Goo is an excellent game for children to learn about physics in a whimsical world. Call of Duty...not so much.

To succeed with these games and apps, you have to form a model of the world in your head. You need to learn the physical rules that govern what you can and can't do in the world, and if it's a 3D world, you'll also be creating a mental map of the game world so you can move through it effectively. Oftentimes this mental model is not based in our physical universe at all. There's a lot of abstract thinking and creative problem solving going on there, and kids are learning without even realizing it because it's fun. We should definitely be encouraging that kind of learning because they are much more likely to retain the knowledge when they are so engaged in the process.

That's a powerful skill set for them to have for the creative jobs of the future. When it comes time to design the next advanced computing device or energy technology or medical delivery system, these kids' minds will be well prepared for the task. They'll be able to dream up novel solutions to these hard problems because they're wired to think in entirely different ways than we are. I'm looking forward to seeing what they come up with.