Friday, October 28, 2011

Testing Heuristics

This is probably my second-favorite reference for testing.  Follow the link and find out why.
http://www.qualityperspectives.ca/resources_mnemonics.html

Pass/Fail vs Real Status

I just read Rikard Edgren's article "Addicted to Pass/Fail" on page 17 of Tea-time with Testers, and an associated blog post by the same author, "Binary Disease".

I wanted to compare this with the Low Tech Testing Dashboard.

You'll also understand this better if you watch Cem Kaner's Black Box Software Testing course videos on measurement theory in testing, and read any related articles you can find by the same author.

This blog post is meant for study, mostly.  My questions are biased because I don't think a metric like "50% of our tests are passing" is meaningful.

Here are my questions for you (and myself):

  • Why is pass/fail data seemingly useful? To whom?  Why?  What decisions are made with it?
    • How powerful is each test?
    • How important is each test?
    • How important is each failure?
    • Which product areas have more failures?  How serious is each failure?
    • Which failures are acceptable?  Why?  Unacceptable?  Why?
    • Do these metrics misrepresent subjective and fallible measurements as being precise?
  • Why is the dashboard useful?  How does it compare to the pass/fail metric?
    • Why is a set of subjective measures better?
    • How are they better?
    • Why is it useful to differentiate the effort and quality in different feature areas?
      • Are you focusing on the right feature areas?
      • If no, maybe you need a different choice of feature areas?
    • Whose perspective does the dashboard represent?
    • Whose other perspectives are needed when making decisions?  Why?  How can they be obtained?
    • Who is the dashboard for?  (managers, usually--but all information workers, generally)
    • What does it report?  (testing progress and test team confidence, etc)
  • What might be the dashboard's shortcomings?
    • Is the dashboard the only communication tool? Can it work together with other tools, effectively?
    • Is the dashboard adaptable?
    • What other tools are needed?
    • Does the dashboard, together with other tools, serve a good purpose?  Does it mislead, or invite misunderstanding?
      • Multiple dimensions of quality?
      • Multiple values and audiences?  (initial vs late audiences? technical vs non? age/gender/ethnicity/experience/intentions/etc)
    • Is it too complex?  Too simple?
    • Does it enable informed decisions?  Does it only seem to provide good info for decisions?

Friday, October 14, 2011

Dennis Ritchie passed away.

http://www.guardian.co.uk/technology/2011/oct/13/dennis-ritchie

Dennis Ritchie's role in computing was probably more important than Steve Jobs's, even if not as glamorous.  The obituary does a very good job of describing his fantastic legacy.

Wednesday, October 5, 2011

Gmail Strikethrough button

I wanted a strikethrough button in gmail.  It turns out you can get one.
http://userscripts.org/scripts/show/57725

How does this relate to testing?
* Oracles.  Could looking at what others are doing with your app (userscripts and extensions) tell you about functionality that's needed?  How does this fit HICCUPPS(F)?  Probably, it means that user expectations can be manifested through specific add-on functionality, among other ways.
* I didn't expect to find this.  It was, however, the top result in a google search I ran, idly trying to find out if I could solve my problem.
* What kinds of ways can we check google (or other sources) for problems people are trying to solve with our software?
* What are the security risks of not providing certain features, such that users are willing to download third-party add-ons that could compromise your site or their data?
* How representative is this, really, of ordinary users?

Wednesday, August 10, 2011

Audience Engagement vs scalable on-demand conversations.

I just read an article on the e-content blog about audience engagement. Here is my response (I've edited it from the version I posted in the comments).

Back in the early days of the internet boom, I recall that a lot of talk show hosts, writers, and producers would schedule online chat sessions with hosts of users (I watched a lot of TV in those days, so I was more likely to notice what was going on in the TV world than other places). The trend seemed to drop off, though--online chat rooms don't seem to be the big thing for engaging with an audience, or maybe I am just out of touch. Personally, I only want to have *live* chats with friends who I see in person.  If I want to connect with a company or star in some field (like testing), I rely on less immediate communication.  (aside: This reminds me of Jakob Nielsen's "Powers of 10: Time Scales in User Experience", where I'm on the one-week or larger scale).

These days, I see authors and publishers reaching out to users in a lot of ways--trying to be wherever their fans/users might be. This means they engage their audiences in multiple conversations over twitter, facebook, blogs, news columns, and their own sites. Because users have a variety of interaction styles, publishers end up targeting a variety of consumers and channels.

In video publishing, you can already see some cases where entire sites are modeled not on live broadcasting and communication, but instead focus on on-demand delivery. Youtube has features for video replies and written responses to videos. J.K. Rowling (author of Harry Potter) said she would read some of the fan-created forums in years past, and occasionally answered open letters and questions submitted (chosen by polls and popular vote) via the authors of the biggest fan sites. I wonder if today's stars hire media consultants to watch all the various online streams and keep them up to date (this includes wikipedia) with current conversations and information.  I suppose that the answer would be "it depends on scale, popularity, etc".

We aren't seeing just single conversations. Publishers and producers have to cater to a wide variety of information consumers: wikipedians, twitterers, facebook users, bloggers, and people who search casually on google and other search engines.

Sites like Netflix and Youtube are trying to capture statistical trends and social communication/recommendation, and use them to their advantage. I don’t see them worrying as much about keeping people engaged in live video streams as they care about helping people find content and conversations they’re interested in (and encouraging them to share content within their networks). They rely less on live content, instead catering to what I would call “on-demand conversation”.  Historical content and niche interests are easier to cater to--so we see more diverse conversations that last far, far longer (I've replied to blog posts and conversations that are weeks, months, even years old, and had useful feedback in doing so; sometimes I reply more to learn than to be heard, but always hope to do both).

Another interesting trend is that big publishers provide ways for large and small audiences to converse. At a large scale, amazon makes it possible for people to mark reviews as helpful or unhelpful so that the best (or just most popular) conversations (or reviews) rise to the top. Other publishers do similar things (digg stories rise to the top, facebook posts with many responses are more likely to show on the feed). It seems that flexibility, aggregation, popularity filtering, and links for expanding conversations are the big features in what we might call scalable conversations.

So what is the future of publishing? We can predict the future only by the past, but the past is a bad way to predict the future. I'm sure we will see some innovative new ways to attract consumers and deliver content, and thus interesting developments in conversation scalability--and we might be as surprised at what doesn't work in the future as what does.

It seems that the trend is toward scalable, on-demand conversations, with an emphasis on the plural "conversations".  Allowing conversations to grow, and adding features as they do, seems important.  (Side note: I wish google reader would let me read comments to blog posts inline).

What does this mean for software developers? Probably, it just means that you need to look for venues, sites and services that already provide a platform for these conversations. Know what you're looking for, and use the services available to you.   And make sure to know the purposes of your conversation.

Friday, July 1, 2011

Bias, Search Strategies, and Converting MS-Word documents to MediaWiki pages

It's interesting that shortly after I posted about the "taboo" heuristic, I found a real situation that warrants a revision.

A couple days ago, I decided I wanted to quickly convert a lengthy microsoft word document to the mediawiki format. I couldn't find a way to do it. Today, I stumbled across a way to do it while searching for something else (a way to automatically upload images to mediawiki). I thank google for remembering previous searches--google uses these to refine my current results, and they also give me a record of what didn't work and what might have worked.

Here are the search terms that were fruitless:
* Convert HTML to wikipedia wiki
* html to wiki converter

The page I stumbled on later was this page, which explains that Openoffice can export to mediawiki format. It never occurred to me to search by the word "export". Why not? I can come up with a variety of reasons:
* I was looking for a converter. I didn't think to question if there were other words to describe my goal, or other ways to accomplish it.
* I was looking for something specific: going from HTML to wiki formats. I assumed that html was the most likely format from which to start. This was a fallacious assumption.
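
(As an aside: the export route I eventually found can be scripted. Below is a rough, untested sketch that drives OpenOffice/LibreOffice headlessly from Python. The "txt:MediaWiki" filter name is an assumption--it requires the Wiki Publisher/MediaWiki export filter, and the exact name may differ between versions--and the file names are placeholders.)

    # Hypothetical sketch: convert a word-processor document to MediaWiki
    # markup by calling the office suite's headless converter.
    import subprocess

    subprocess.run([
        "soffice", "--headless",
        "--convert-to", "txt:MediaWiki",   # assumed MediaWiki export filter name
        "--outdir", "converted",
        "document.docx",                   # placeholder input document
    ], check=True)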

That failed search implies something about google search strategies. As Jakob Nielsen says, people rarely change search strategy -- in that article, he explains some of the problems with search skills, and how search engines can be made more usable.

In my case, I keep thinking "what heuristics can I come up with, that will help my search strategy in similar situations in the future?"

I have other search problems, too--what happens when I want to find a fun new game to play, but I have no idea what I like? What happens when I don't have a specific thing in mind--I just want father's day gift ideas?

In order to find out what heuristics work for changing a search strategy, I will need to come up with a new search problem, and a list of heuristics that might help.

New Search Problem (test my new heuristics)
I have been looking for an easier way to upload images to wikipedia. I haven't found a way to quickly upload images to wikipedia yet, despite trying to find solutions in the past. So, that will be my test case for a new heuristic. If a heuristic helps me solve my problem, I will know I have a success. There is a risk of a false negative: a genuinely useful heuristic might still fail on this particular problem. Also, the heuristic that works this time might not work for other search problems. I don't plan to address that here.

From here on, I am chronologically showing my thought processes and search progress.

New Heuristics to try on my new search problem
I'm brainstorming ways to re-frame a search problem.
Here are some of the heuristics I thought of to try, and things that worked with the previous example:
* De-focus. Make my search terms less specific. Avoid exact file types.
* Find synonyms. Use a thesaurus to find similar words and actions, related to what I want to do. For "convert", I might need to do some actual research to find alternative ways of expressing my desire. Some examples might be save as, upload, export, convert.
* Find related actions. Break down the steps of what I want to do, by zooming out and then zooming in. With "convert html to wiki", I could zoom out to my main goal: get a formatted document into mediawiki without having to re-work the formatting manually. My starting document is MS-Word, but it could be any of a number of formats that any office editor can save to. This sparks some ideas: maybe I can use openoffice, abiword, wordpad, notepad. Maybe I have to use several in a row to get to a format I like (one being a pdf). With so many options, I need to focus on the destination file type first: mediawiki. So I should search google for "to mediawiki format", or some variants.

What happened?
As I continued searching, I came up with other ideas for switching up my search strategy--and this is probably the most important element: actively thinking of ways to break out of your box is itself the most important heuristic.

Here are some of the strategies I tried: (remember, I was looking for a way to quickly upload images to wikipedia--specifically while I am already in the middle of editing an article):
* Think in terms of a less-tech-savvy user. What questions would they ask? Or instead, what complaints would they make? I searched for "too hard upload image wikipedia"
* Think in terms of the basics. "How to add image to mediawiki OR wikipedia" (less success)
* Think in terms of alternate related scenarios that don't quite fit my goal. "I need to upload a ton of images to mediawiki" (this seemed more successful--results looked more promising)
** Other scenarios involve automatically uploading images to other sites, integrated browser features, wiki image editors, and so forth.
* Look further down in search results (not much success)
* Click on something that doesn't look helpful
* Look at words in search results to get ideas for more search terms. (resulting terms: upload, script, import, robot)

Results:
It looks like there are basic robots I can use to do what I want, but a complete solution isn't available. I wanted to be able to paste from my clipboard into wikipedia, as quick and simple as that. The closest thing to a solution I found was http://meta.wikimedia.org/wiki/Pywikipediabot/upload.py -- perhaps I will have to build my own tool? By searching for "paste images to mediawiki", I found this fogbugz feature request, which shows me that others have been unsuccessful in finding a similar tool (so far).
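
If I do end up building my own tool, a starting point might look something like this minimal sketch, which uploads a single image through the standard MediaWiki action API. The endpoint URL and file names are placeholders, and a real wiki would normally require a login step (e.g. a bot password) and upload rights before any of this works.

    # Minimal sketch of uploading one image via MediaWiki's action API (api.php).
    # Assumes the session is already authorized to upload.
    import requests

    API = "https://wiki.example.org/w/api.php"   # placeholder wiki endpoint
    session = requests.Session()

    # 1. Fetch a CSRF token.
    token = session.get(API, params={
        "action": "query", "meta": "tokens", "type": "csrf", "format": "json",
    }).json()["query"]["tokens"]["csrftoken"]

    # 2. Upload the file with action=upload.
    with open("screenshot.png", "rb") as image:  # placeholder local file
        response = session.post(API, data={
            "action": "upload",
            "filename": "Screenshot.png",
            "token": token,
            "format": "json",
        }, files={"file": image})

    print(response.json())                       # success or error details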

Analysis:
One of the best google searches I tried, near the end, was "clipboard mediawiki image". I left out verbs entirely. Perhaps one search strategy is to eliminate verbs, nouns, or adjectives, or focus on exclusive lists of the nouns, verbs, and adjectives that will be most useful. Also, since nouns are more concrete, perhaps they are likely to be more successful. With this search strategy, I found a list of very promising results--which kept me looking further down in the results page and (unlike other searches) I went to the second page of results. This seemed to result in a near hit: someone bragging that they had made a windows tool for the job. However, there was no link to the project. I couldn't find the project by name or description, after that...

Conclusion
You have to use a variety of search strategies, and I am not sure what works best. It seems like searching for alternate scenarios, and reading more than the top screen full of promising results, was most helpful. Finding alternate words, in this case, didn't help a lot--but it's hard to prove it won't help in other circumstances. Dropping verbs helps. Looking for words and terms to drop (or change and generalize) is often more difficult and more helpful than looking for search terms to add.

In the end, I decided that the easiest solution to my particular problem was to just use the current mediawiki file upload features. However, it was a useful introspective study of some ways to change search strategies. Maybe I will compose a list of search strategy heuristics as I learn more.

Edit/Addendum
It occurred to me weeks later that I never thought about trying alternate search engines, specialized search engines, or just asking a question on a forum like superuser.com.

I also thought of a couple more search strategy changes that generalize on what I stated above.
  • Search by a problem statement
  • Search for similar problems that are not your problem (e.g. batch upload, batch remove, etc)
  • Search by solution keywords
  • Eliminate key search terms, eliminating assumptions
  • Add key search terms
  • Find synonyms. Use a thesaurus or a reverse dictionary.
  • Generalize any and every search term (e.g. mediawiki->wiki->upload/paste/...)
  • Get more specific on any search term (e.g. mediawiki->wikipedia->wikipedia:image ooh, wikipedia:file...)
  • Drop (all) search terms by grammatical type (verb, noun, adjective)
  • Switch to different grammar parts (switch from verbs to nouns, etc)
  • Try to think like someone else (e.g. a less tech-savvy user)
  • Ask if a search engine is the right venue. (e.g. forums, libraries, real people, etc)
  • Ask if this search engine is the right venue. (find other search engines, specialized or not)
  • Search for other search terms using results pages, and especially forum conversations. This takes more time.
  • Keep looking "here": Follow links that don't look promising. Look at several more pages of results.
EDIT: 
In a conversation with a friend, they pointed out that another search re-framing heuristic is to find other information sources: switch search engines, find a specialized search engine, or look at bibliographic data of unavailable/inapplicable solutions to find similar sources or ideas.

Also, I recently read Kaner's document on ET, where he suggests using the CIA Phoenix Checklist for problem solving.  I think it's very applicable.  You will probably find it if you search around a bit.

Thursday, June 30, 2011

The HICCUPPS heuristic, ethics, and future.

A while back I read about the HICCUPPS(F) testing oracles heuristic, which I thought was fascinating. As a challenge for myself, (and based on James Bach's encouragement to own my own methodology), I decided to try and come up with something that HICCUPPS missed.

If you haven't heard of HICCUPPS(F) before, you can read what HICCUPPS(F) means, from Michael Bolton's blog.

Ethics / Decision Influencers
I came up with another consistency heuristic, "Consistent with Ethical Standards". I emailed Michael Bolton about this, and he felt that ethics can fit under "Consistent with User Expectations." I think he's right. But I also think that makes it easy to overlook ethical problems. An easy example (for me) is privacy. Daniel Solove has written entire books on understanding privacy, and has a taxonomy of around 20 privacy harms; the Safe Harbor website lists seven protection principles for data privacy. (As a side note, I think Solove de-emphasized the harm of long-term storage versus ephemerality, but that's another discussion). The point is that privacy and its details (and the ethics of data privacy) aren't really on most people's radar. That's changing, but slowly. I think a lot of people aren't really aware of how privacy affects them, nor of the ethical ramifications of various technical privacy controls. Mostly, it comes down to the fact that without highly visible use of our personal data, we can't really understand how our privacy is being violated. So how do user expectations come into play, when we don't even understand the ramifications of privacy violations? Solove wrote an entire paper on the idea that people say "I've got nothing to hide" and how that misses the point of privacy. Well, several points.

So, what happens when users don't have high enough ethical expectations of the software they're using? Or what happens when users don't have the privacy expectations they should have--because they're ignorant?  What happens if programmers and testers don't understand privacy implications, or the history of privacy and how it is evolving?  I think there are ethical considerations that reach beyond User Expectations. I am not about to re-invent the HICCUPPS(F) heuristic, but I might personally call it HICCUPPS(EF) or something.  I might also put it under the heading "Activist/Expert Expectations" -- noting of course that we don't want to be consistent with all activist and media expectations.  You might care very much about reviews from media outlets like cnet or consumer reports, though.  This also reminds me of something my Human-Computer-Interfaces professor said once: that expert users are also influential users: they help their friends and others decide which software to use.  So, when thinking about user expectations, also make sure you're thinking about influential expert users--past, present and future.

Future
Another shortcoming I thought about was that HICCUPPS doesn't explicitly focus on the future of a product. A product's future potential, future plans, and the changing nature of certain human processes make it likely that software with poor designs will not be adaptable. Is the software maintainable? Or rather, is the software or feature consistent with likely futures of the product and its elements? This fits with the "H", but only somewhat--so now I try to remember that when I ask about history, I should think about the future too. One example: a software product I tested once used building names to identify data records. However, it turns out that some of the records changed buildings, and that some of the buildings' names changed. The database wasn't flexible enough to allow for these changes. We worked around the problem, but it was still a problem (caused by management directives, I might add. The developer knew there were risks when he made the first implementation, even if they were underestimated).

The "Else" Question
The author of one of my favorite sci-fi books published some information on how to write books.  When inventing plots and motivations, he challenged writers to never stop with the first answer: always ask an "else" question.  Why else?  What else?  And so on.

So, my question is this: can you come up with something that HICCUPPS misses, or at least something it doesn't explicitly include?

In other words, what else are some of the shortcomings (or hidden details) of HICCUPPS(F)?

The Taboo Heuristic

A few months ago I came up with a software testing heuristic I thought was useful for me, which I will call the "Taboo Heuristic". Here's how I described it when I emailed James Bach to find out if there was already something like it--or if I had come up with a new test technique. I've modified the original message a bit...

Taboo (avoid the feature) - try to avoid using or finding a certain feature or feature set (while testing the help system or the function itself). Try to find a workaround or another way to accomplish a task. Try other ways of starting a feature. Think of features and buttons and ideas that might be "the wrong track" for a user to go down. Try searching for help without using the feature’s name (play “taboo” with yourself in the help system. Anytime you find the help, exclude the words that worked and try again).

This could be useful for testing feature findability/learnability; for testing help systems and search systems; and for looking for alternate ways to accomplish tasks once we've formed what we think is the "right" model of what users would do. We might find inconsistently named features, or extra choices that confuse users, or another way to accomplish a task (or user workaround) in a useful, surprising, or ordinary way.

This concept comes from a game where you are trying to get your team to say a word without saying the actual word or any of a list of related words.

As an example, if you want to preserve a document without saving it, you could try printing, copy/pasting, exporting, and a variety of other things. If you wanted to send an email without the "send" button, you might find a second send button you hadn't noticed before, or discover the fast way of doing it with keyboard shortcuts (the way ideal only for power-users).

Some things you can play taboo with include:
* Mouse actions
* Keywords and search terms
* UI elements
* Menus
* Input devices
* etc...

Movie Frames Heuristic

Do you remember when you were a kid, and you thought that wood fences and chain-link fences with plastic slats in them were opaque? Then, one day, you noticed that if you stared straight through the fence at what might be beyond, and kept moving sideways (gamers call this strafing), you could dimly see what was on the other side?

That's sort of the principle of how film works too--flashing frames at you all the time, while your brain misses the blank spaces between frames. It's pretty neat.

So, we see that some of these physical barriers to protect privacy are hardly barriers at all. If you haven't tried what I'm talking about--just remember that your computer and movies do it on purpose all the time.

But what happens if you're so busy staring through the fence that you forget to look at the fence itself?

I am talking about software testing, though. As developers, we often think of our solutions as opaque and solid. We don't think about the gaps, because when you are closely examining a fence--just standing and staring at it--you don't see how much information can get through.  For some applications, this might be okay.

I just read about how hackers will use SQL injection attacks to get data, one bit at a time if they have to. They get persistent, and tricky. They like those special effects. They understand those M.C. Escher drawings.  Some of them think laterally.

I've also read an article by context-driven testers about "blink testing". It's this idea that you quickly scroll through log files that look mostly the same, watching for patterns that stand out, and discrepancies that don't fit.

But the analogy can reach further. How can we think about our programs as moving parts? Do we ever use our debuggers to log the how and why of variable changes? Do we ever watch sped-up versions of network traffic, resource allocations, or just plain old User Interfaces?

Where else can these ideas be useful? How can you use motion, time, and gaps to make something visible that we thought to be opaque?

I call this the movie-frames heuristic, because as you switch rapidly between views of any product element, you might notice differences that, by themselves, don't stand out. Are there tools to help do this? I suppose that's a question worth looking into.
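
Here is one rough sketch of the kind of tool I have in mind, assuming the "frames" are successive snapshots of a log file (the path, interval, and frame count are placeholders): capture the state repeatedly and show only what changed between frames, so differences pop out the way they do when you flip quickly between two images.

    # Sketch of a "movie frames" helper: snapshot some observable state at a
    # fixed interval and print only the lines that changed between frames.
    # capture_state() is a stand-in for whatever you really observe (a log
    # tail, a resource listing, a dump of UI state).
    import difflib
    import time

    LOG_PATH = "/var/log/app.log"   # placeholder "frame" source

    def capture_state():
        with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
            return f.read().splitlines()

    previous = capture_state()
    for frame in range(10):                      # ten frames, one per second
        time.sleep(1)
        current = capture_state()
        # Show only what appeared or disappeared since the last frame.
        for line in difflib.unified_diff(previous, current,
                                         fromfile=f"frame {frame}",
                                         tofile=f"frame {frame + 1}",
                                         lineterm=""):
            print(line)
        previous = current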