Do you remember when you were a kid, and you thought that wood fences and chain-link fences with plastic slats in them were opaque? Then, one day, you noticed that if you stared straight through the fence at whatever might be beyond, and kept moving sideways (gamers call this strafing), you could dimly see what was on the other side?
That's sort of the principle of how film works too--it flashes frames at you all the time, while your brain smooths over the blank spaces between them (persistence of vision). It's pretty neat.
So, we see that some of these physical barriers meant to protect privacy are hardly barriers at all. If you haven't tried what I'm talking about, just remember that your computer and movies do it on purpose all the time.
But what happens if you're so busy staring through the fence that you forget to look at the fence itself?
Of course, I'm really talking about software testing. As developers, we often think of our solutions as opaque and solid. We don't think about the gaps, because when you're closely examining a fence--just standing and staring at it--you don't see how much information can get through. For some applications, this might be okay.
I just read about how attackers will use SQL injection to pull data out, one bit at a time if they have to. They're persistent, and tricky. They like those special effects. They understand those M.C. Escher drawings. Some of them think laterally.
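To make that "one bit at a time" idea concrete, here's a toy sketch. Everything in it is invented for illustration: instead of a real database, the yes/no "oracle" is just a local function, standing in for the crafted queries (think `... AND ASCII(SUBSTR(secret,1,1)) & 4 > 0`) a blind-injection attacker would send.

```python
# Toy illustration of blind, bit-at-a-time data extraction.
# In a real attack, each call to oracle() would be one crafted SQL
# query whose true/false answer leaks through timing or page content.

SECRET = "s3cret"  # hypothetical value the attacker cannot read directly

def oracle(char_index: int, bit_index: int) -> bool:
    """Answers a single yes/no question: is this bit of this character set?"""
    return bool((ord(SECRET[char_index]) >> bit_index) & 1)

def extract(length: int) -> str:
    """Recover the secret one bit at a time, eight questions per character."""
    chars = []
    for i in range(length):
        code = 0
        for bit in range(8):
            if oracle(i, bit):
                code |= 1 << bit
        chars.append(chr(code))
    return "".join(chars)

print(extract(len(SECRET)))  # prints "s3cret", rebuilt from yes/no answers
```

The point isn't the code--it's that a barrier leaking one bit per question is not a barrier, just a slow fence to strafe past.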
I've also read an article by context-driven testers about "blink testing": the idea that you scroll quickly through log files that look mostly the same, watching for patterns that stand out and discrepancies that don't fit.
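A crude, automated cousin of blink testing is to flip between two runs' logs and keep only the lines that differ. This sketch uses Python's standard `difflib`; the log lines are made up for the example.

```python
# Flip between two runs of the "same" log and keep only what changed --
# the discrepancies a blink test is hunting for.
import difflib

run_a = [
    "startup ok",
    "cache warm in 120ms",
    "served /home 200",
    "served /login 200",
    "shutdown clean",
]
run_b = [
    "startup ok",
    "cache warm in 4500ms",
    "served /home 200",
    "served /login 500",
    "shutdown clean",
]

# n=0 drops all unchanged context lines, like your eye skimming past them
for line in difflib.unified_diff(run_a, run_b, lineterm="", n=0):
    if line.startswith(("+", "-")) and not line.startswith(("+++", "---")):
        print(line)
```

Run it and everything the eye would skim past vanishes; the slow cache warm and the 500 error are what's left flickering.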
But the analogy can reach further. How can we think about our programs as moving parts? Do we ever use our debuggers to log the how and why of variable changes? Do we ever watch sped-up versions of network traffic, resource allocations, or just plain old user interfaces?
Where else can these ideas be useful? How can you use motion, time, and gaps to make something visible that we thought to be opaque?
I call this the movie-frames heuristic, because as you switch rapidly between views of any product element, you might notice differences that, by themselves, don't stand out. Are there tools to help do this? I suppose that's a question worth looking into.
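One tool is easy enough to sketch yourself: overlay two captures of the same view and blank out everything that agrees. Here tiny character grids stand in for screenshots; a real version would compare pixels the same way.

```python
# Movie-frames diff: overlay two "frames" of the same screen and
# keep only the cells that changed between them.

frame_1 = [
    "Balance: 100 ",
    "Status: OK   ",
]
frame_2 = [
    "Balance: 100 ",
    "Status: ERROR",
]

def flicker(a, b):
    """Return a mask: blank where the frames agree, marked where they don't."""
    mask = []
    for row_a, row_b in zip(a, b):
        mask.append("".join("^" if ca != cb else " "
                            for ca, cb in zip(row_a, row_b)))
    return mask

for row in flicker(frame_1, frame_2):
    print(row)
```

Flip between the two frames by hand and the status change is easy to miss; the mask makes the flicker itself the output.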