Thursday, June 30, 2011

The HICCUPPS heuristic, ethics, and the future.

A while back I read about the HICCUPPS(F) test oracle heuristic, which I thought was fascinating. As a challenge for myself (and following James Bach's encouragement to own my own methodology), I decided to try to come up with something that HICCUPPS missed.

If you haven't heard of HICCUPPS(F) before, you can read what it stands for on Michael Bolton's blog.

Ethics / Decision Influencers
I came up with another consistency heuristic: "Consistent with Ethical Standards." I emailed Michael Bolton about this, and he felt that ethics can fit under "Consistent with User Expectations." I think he's right. But I also think that makes it easy to overlook ethical problems. An easy example (for me) is privacy. Daniel Solove has written entire books on understanding privacy and has a taxonomy of sixteen privacy harms; the Safe Harbor website lists seven protection principles for data privacy. (As a side note, I think Solove de-emphasized the harm of long-term storage versus ephemerality, but that's another discussion.) The point is that privacy and its details (and the ethics of data privacy) aren't really on most people's radar. That's changing, but slowly. I think a lot of people aren't really aware of how privacy affects them, nor of the ethical ramifications of various technical privacy controls. Mostly, it comes down to this: without highly visible use of our personal data, we can't really understand how our privacy is being violated. So how do user expectations come into play when we don't even understand the ramifications of privacy violations? Solove wrote an entire paper on the idea that people say "I've got nothing to hide" and how that misses the point of privacy. Well, several points.

So, what happens when users don't have high enough ethical expectations of the software they're using? What happens when users don't have the privacy expectations they should have--because they don't know better? What happens if programmers and testers don't understand privacy implications, or the history of privacy and how it is evolving? I think there are ethical considerations that reach beyond User Expectations. I am not about to re-invent the HICCUPPS(F) heuristic, but I might personally call it HICCUPPS(EF) or something. I might also put it under the heading "Activist/Expert Expectations"--noting, of course, that we don't want to be consistent with all activist and media expectations. You might care very much about reviews from CNET or Consumer Reports, though. This also reminds me of something my Human-Computer Interaction professor once said: expert users are also influential users--they help their friends and others decide which software to use. So, when thinking about user expectations, make sure you're also thinking about influential expert users--past, present, and future.

Future
Another shortcoming I thought about is that HICCUPPS doesn't explicitly focus on the future of a product. A product's future potential, its future plans, and the changing nature of certain human processes make it likely that poorly designed software won't adapt. Is the software maintainable? Or rather, is the software or feature consistent with the likely futures of the product and its elements? This fits with the "H" (History), but only somewhat--so now I try to remember that when I ask about history, I should think about the future too. One example: a software product I tested once used building names to identify data records. It turned out that some of the records changed buildings, and that some of the buildings' names changed. The database wasn't flexible enough to allow for these changes. We worked around the problem, but it was still a problem (caused by management directives, I might add--the developer knew there were risks when he made the first implementation, even if they were underestimated).
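
To make that example concrete, here's a minimal sketch (invented data, not the real system) of why identifying records by building name hurt: a natural key that tracks the changing world is fragile, while a stable surrogate key leaves room for the likely future.

```python
# Hypothetical sketch: records keyed by a building name break when the
# real world changes; records keyed by a surrogate id just update a field.

# Fragile: the building name *is* the identifier, so a rename or a move
# orphans the record or forces a risky rewrite of the key itself.
fragile_records = {
    "North Annex": {"equipment": "scanner-07"},
}

# More adaptable: a stable id identifies the building; the name is just an
# attribute that can change without touching any record keys.
buildings = {101: {"name": "North Annex"}}
adaptable_records = {
    "rec-1": {"building_id": 101, "equipment": "scanner-07"},
}

# The rename the real world eventually demanded is now a one-field update.
buildings[101]["name"] = "Visitor Center"
```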

The "Else" Question
The author of one of my favorite sci-fi books published some advice on how to write books. When inventing plots and motivations, he challenged writers never to stop with the first answer: always ask an "else" question. Why else? What else? And so on.

So, my question is this: can you come up with something that HICCUPPS misses, or at least something it doesn't explicitly include?

In other words, what are some other shortcomings (or hidden details) of HICCUPPS(F)?

The Taboo Heuristic

A few months ago I came up with a software testing heuristic that I found useful, which I call the "Taboo Heuristic". Here's how I described it when I emailed James Bach to find out if there was already something like it--or if I had come up with a new test technique. I've modified the original message a bit...

Taboo (avoid the feature) - try to avoid using or finding a certain feature or feature set (while testing the help system or the function itself). Try to find a workaround or another way to accomplish the task. Try other ways of starting a feature. Think of features, buttons, and ideas that might be "the wrong track" for a user to go down. Try searching for help without using the feature's name (play "taboo" with yourself in the help system: any time you find the right help topic, exclude the words that worked and try again).

This could be useful for testing feature findability and learnability; for testing help and search systems; and for looking for alternate ways to accomplish tasks once we've formed what we think is the "right" model of what users would do. We might find inconsistently named features, extra choices that confuse users, or another way to accomplish a task (or a user workaround)--useful, surprising, or ordinary.

The concept comes from the party game Taboo, where you try to get your team to say a word without using that word or a list of related words.

As an example, if you want to preserve a document without saving it, you could try printing, copying and pasting, exporting, and a variety of other things. If you want to send an email without the Send button, you might find a second send button you hadn't noticed before, or you might fall back on keyboard shortcuts--the fast way that's really only ideal for power users.

Some things you can play taboo with include:
* Mouse actions
* Keywords and search terms
* UI elements
* Menus
* Input devices
* etc...
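
Here's a minimal, hypothetical sketch of the help-system version of this--the "exclude the words that worked and try again" loop from the email above. The search_help function, its topics, and its keywords are invented stand-ins for whatever help or search feature you would actually be testing.

```python
# Playing "taboo" with a pretend help system: every time a set of search
# terms finds the topic, those terms become off limits, and we try again
# until we can no longer find the feature at all.

def search_help(terms):
    """Pretend help search: return matching topic titles and the terms that hit."""
    topics = {
        "Send a message": {"send", "message", "email"},
        "Deliver mail using shortcuts": {"deliver", "mail", "shortcut"},
    }
    hits, matched = [], set()
    for title, keywords in topics.items():
        overlap = terms & keywords
        if overlap:
            hits.append(title)
            matched |= overlap
    return hits, matched

candidate_terms = {"send", "email", "deliver", "mail", "outbox"}
banned = set()

while True:
    allowed = candidate_terms - banned
    hits, matched = search_help(allowed)
    if not hits:
        print("No more routes; words that never worked:", sorted(allowed))
        break
    print("Found", hits, "via", sorted(matched))
    banned |= matched  # taboo rule: the words that worked are now off limits
```

The interesting results are the words that never work (possible vocabulary gaps in the help) and the surprising topics you reach through the "wrong" words.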

Movie Frames Heuristic

Do you remember when you were a kid, and you thought that wood fences and chain-link fences with plastic slats in them were opaque? Then, one day, you noticed that if you stared straight through the fence and kept moving sideways (gamers call this strafing), you could dimly see what was beyond?

That's sort of how film works, too--it flashes frames at you while your brain misses the blank spaces between them. It's pretty neat.

So some of these physical barriers meant to protect privacy are hardly barriers at all. If you haven't tried what I'm describing, just remember that your computer and the movies do it on purpose all the time.

But what happens if you're so busy staring through the fence that you forget to look at the fence itself?

I am talking about software testing, though. As developers, we often think of our solutions as opaque and solid. We don't think about the gaps, because when you're closely examining a fence--just standing and staring at it--you don't see how much information can get through. For some applications, that might be okay.

I just read about how hackers will use SQL injection attacks to get data, one bit at a time if they have to. They get persistent, and tricky. They like those special effects. They understand those M.C. Escher drawings.  Some of them think laterally.
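
As a hypothetical illustration of "one bit at a time": a system that only ever answers yes/no questions can still leak an entire secret if you ask enough of them. The secret and the oracle below are made up for the sketch; blind SQL injection works on the same principle against a database.

```python
# A yes/no "oracle" that never displays the secret directly -- yet a short
# loop of questions recovers it completely, one bit at a time.

SECRET = 0b101011  # something the interface never shows

def yes_no_oracle(bit_position):
    """Answers only: is this bit of the secret set?"""
    return bool(SECRET & (1 << bit_position))

recovered = 0
for bit in range(6):
    if yes_no_oracle(bit):
        recovered |= 1 << bit

print(recovered == SECRET)  # True: the "opaque" fence leaked everything
```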

I've also read an article by context-driven testers about "blink testing". It's this idea that you quickly scroll through log files that look mostly the same, watching for patterns that stand out, and discrepancies that don't fit.

But the analogy can reach further. How can we think about our programs as moving parts? Do we ever use our debuggers to log the how and why of variable changes? Do we ever watch sped-up versions of network traffic, resource allocations, or just plain old User Interfaces?

Where else can these ideas be useful? How can you use motion, time, and gaps to make something visible that we thought to be opaque?

I call this the movie-frames heuristic, because as you switch rapidly between views of any product element, you might notice differences that, by themselves, don't stand out. Are there tools to help do this? I suppose that's a question worth looking into.
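
I don't know of an off-the-shelf tool, but a first experiment could be as small as this sketch. The "frames" here are canned snapshots of an imaginary status view; in practice they would come from whatever you want to flip through--log tails, resource counters, or screenshots (where the comparison becomes an image diff instead).

```python
# Movie-frames sketch: take periodic snapshots of some view and show only
# what changed between consecutive frames, so differences flicker into
# view instead of hiding in the sameness. The data below is invented.

frames = [
    {"connections": "42", "errors": "0", "build": "1.3.7"},
    {"connections": "43", "errors": "0", "build": "1.3.7"},
    {"connections": "41", "errors": "2", "build": "1.3.7"},
]

for i in range(1, len(frames)):
    previous, current = frames[i - 1], frames[i]
    changed = {key: (previous.get(key), value)
               for key, value in current.items()
               if previous.get(key) != value}
    print(f"frame {i}: changed -> {changed or 'nothing'}")
```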