April 29, 2008

Back from family trip to the USA

Back from a ten-day tour of New York City and Orlando - now I understand why some say that Disney World is a "once in a lifetime" experience - it will take me a long time to recover from the crowds and the long lines.

But enough crying, got a bug and a feature to share.

Bug:
See if you can spot the bug in this postcard:


Feature:
The ride I enjoyed most in Disney World was Epcot's "Test Track". It simulates a General Motors test department, with loads of displays telling the story of the automotive testing industry and its silent heroes - the crash-test dummies. Among the things I enjoyed seeing were test result cards, formatted just as you would expect: test ID, test scenario, expected result, actual result, passed/failed, etc. - the entire story! I felt at home.



When you get to the actual ride, you sit in a car and take part in a test session directed by a test manager and a test engineer, who take you and the car through a series of challenges, including steep hills, extreme weather, emergency braking (with and without ABS) and an extreme-speed drive around the facility.

What more can a tester want? (hint: not to find real bugs in the ride :-#)

April 06, 2008

"Error Guessing" as a software testing method

I admit that good testers can guess where bugs will be found (or at least say "I told you!" after they find them). BUT, I don't like seeing "Error Guessing" listed as a test technique in software testing books. If it can't be taught, it is not a technique (IMHO).

Look at the ISTQB glossary definition (PDF):
error guessing: A test design technique where the experience of the tester is used to
anticipate what defects might be present in the component or system under test as a result
of errors made, and to design tests specifically to expose them.

So, the tester should apply good old intuition, drawing on her past experience to guess where the errors will hit this time. On the one hand this sounds trivial (learning from experience should be a basic human trait), but it is also very vague: suppose I have a tester with zero experience - can I teach her to guess errors? If I have two testers with the same experience - will they guess the same errors?

So, what can be done to improve your error guessing techniques?
  • Improve your memory
  • Improve your technical understanding:
    • Go into the code, see how things are implemented, understand concepts like buffer overflow, null pointer assignment, array index boundaries, iterators, etc.
    • Learn about the technical context in which the software is running, special conditions in your OS, DB or web server.
  • Remember to look for errors not only in the code:
    • Errors in requirements
    • Errors in design
    • Errors in coding
    • Errors in build
    • Errors in testing (we never make mistakes, do we?)
    • Errors in usage
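To make "array index boundaries" from the list above concrete, here is a minimal, hypothetical sketch (not from the original post - the function and values are invented for illustration) of an off-by-one bug, together with the boundary-value checks that expose it:

```python
def contains(haystack, needle):
    """Buggy linear search: the off-by-one in range() skips the last element."""
    for i in range(len(haystack) - 1):   # bug: should be range(len(haystack))
        if haystack[i] == needle:
            return True
    return False

# Boundary-value checks: first, middle, and last positions.
print(contains([1, 2, 3], 1))  # True  - first element is found
print(contains([1, 2, 3], 2))  # True  - middle element is found
print(contains([1, 2, 3], 3))  # False - bug! the last element is never checked
```

A tester who understands how such loops are implemented knows that the last index is where this kind of code tends to break, and aims a test exactly there.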
An error-guessing story (wake up ego!):
A new team member was asked to plan tests for a web application screen that displayed items in batches of 25. I reviewed his test cases and asked him to add a test searching for missing or duplicated items around the "stitch" between batches. After executing the tests, he came to me saying that indeed the last item in each batch was duplicated, appearing again as the first item of the next batch. He looked at me with awe and asked, "How did you know that would happen?" I knew because this is exactly what I missed when testing a paged list a few years back - a customer found it for me... (Ego, back to sleep).
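The "stitch" bug in the story can be sketched in a few lines. This is a hypothetical reconstruction, not the actual application's code - the function name and batch math are my assumptions about how such a bug typically looks:

```python
def get_batch(items, batch_num, batch_size=25):
    # Buggy sketch: the start index is computed with (batch_size - 1),
    # so each batch begins with the last item of the previous batch.
    start = batch_num * (batch_size - 1)   # bug: should be batch_num * batch_size
    return items[start:start + batch_size]

# "Stitch" test: the last item of batch 0 must not reappear in batch 1.
items = list(range(100))
b0, b1 = get_batch(items, 0), get_batch(items, 1)
print(b0[-1], b1[0])   # prints "24 24" - the boundary item is duplicated
```

The point of the story stands either way: a test aimed at the seam between batches catches this class of bug regardless of which off-by-one caused it.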



April 05, 2008

ISTQB self-study group

We started an ISTQB self-study group at work last week. We got around 18 people on the list of interested leads; I expect the actual number will be much lower.

The idea is to use the book "Foundations of Software Testing: ISTQB Certification" as a study guide, meeting once a week to discuss the current sections/chapters.

I read the first 3 sections, which cover the basic questions "what is testing" and "why is testing necessary". I felt the book was oversimplified and naive. To give you an example, the authors tend to go into lengthy analogies that bring little value (for example, comparing a software product to a tomato in order to discuss definitions of "quality").

On the other hand, I think that for beginners it may do the trick, and serve as a clear introduction to software testing. I will write more as we progress.

April 01, 2008

bug/feature: dusty BSOD, Many Eyes

Bug:
A lengthy and detailed report of analyzing a Blue Screen Of Death situation ends up with cleaning dust off the CPU fan and heat sink - and presto! The BSOD is gone.
Memo to myself - try it on my laptop (got a couple of BSODs last week).

Feature:
Many Eyes is a wonderful site that lets you upload data to share with others and use visual analysis to study your data and make persuasive points about it to the world. You can also play around with data provided by others, to see what you can extract from it.
The collaborative nature of this site allows users to discuss the data itself, and the statistical inferences made by data users.

Innovative data visualization tools are a must in this age of accelerating data accumulation. The best example I have seen is Hans Rosling's presentation at TED - I suggest investing 20 minutes in this entertaining and brilliant talk. Done? Go play with the data on the Gapminder website.