Rob Kuijt's Testing Blog
Test cases directly generated from Activity Diagrams! 
Wednesday, April 23, 2008, 01:57 PM - Testing, TMap®, Rosario
Posted by Rob Kuijt
I don't like manual work. Especially when I think it can be automated!
Last weekend I had a cool breakthrough in my effort to make the work of Generalist Testers less boring!
I succeeded in creating Process Cycle Test cases directly from the XML file of an Activity Diagram that I created in the Rosario April CTP (Activity Diagrams are part of the new Team Architect functionality of the next Microsoft Team System).



Text from TMap® Next:
The Process Cycle Test is a technique that is applied in particular to the testing of the quality characteristic of Suitability (integration between the administrative organization and the automated information system). The test basis should contain structured information on the required system behavior in the form of paths and decision points. The process cycle test differs on a number of points from most other test design techniques:
  • The process cycle test is not a design test, but a structure test: the test cases issue from the structure of the procedure flow and not from the design specifications.
  • The predicted result in the process cycle test is simple: the physical test case should be executable. This checks implicitly that the individual actions can be carried out. In contrast to other test design techniques, no explicit prediction is made of the result, and so this does not have to be checked.


TMap® is a registered trademark of Sogeti Nederland BV
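

To make the idea that "the test cases issue from the structure of the procedure flow" concrete, here is a minimal sketch of my own (it is not TMap® itself, and the order flow in it is invented) that derives process cycle test cases for test depth 1, i.e. every exit of every decision point is covered by at least one executable path:

# A minimal sketch (not TMap(R) itself): derive process cycle test cases
# (test depth 1: every exit of every decision point covered at least once)
# from a small, invented procedure flow.

# Procedure flow: node -> list of successor nodes.
# "Check order" and "Check stock" are the decision points, each with two exits.
FLOW = {
    "Start": ["Check order"],
    "Check order": ["Register order", "Reject order"],
    "Register order": ["Check stock"],
    "Check stock": ["Ship", "Back order"],
    "Ship": ["End"],
    "Back order": ["End"],
    "Reject order": ["End"],
}

def all_paths(node="Start", path=()):
    """Enumerate every path from Start to End through the flow."""
    path = path + (node,)
    if node == "End":
        yield path
        return
    for nxt in FLOW[node]:
        yield from all_paths(nxt, path)

def test_depth_1(paths):
    """Greedily pick paths until every edge (every decision exit) is covered."""
    uncovered = {(a, b) for a, successors in FLOW.items() for b in successors}
    selected = []
    for p in sorted(paths, key=len, reverse=True):
        edges = set(zip(p, p[1:]))
        if edges & uncovered:
            selected.append(p)
            uncovered -= edges
    return selected

if __name__ == "__main__":
    for i, case in enumerate(test_depth_1(list(all_paths())), start=1):
        print(f"Test case {i}: " + " -> ".join(case))

The predicted result stays as simple as TMap® describes it: each printed test case must be executable from begin to end; no further expected outcome is specified.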


Automating the techniques for creating test cases isn't done very often. Most of the effort goes into managing the test process as a whole and into recording and executing test cases. Clemens Reijnen wrote a nice article about this subject on his blog (clemensreijnen.nl). Over the last five years I have done some development work on the specification of test cases. I succeeded in creating a set of small web-based programs that can support the tester by automating some of the steps during the creation of test cases. But the connection between the Functional Design (for instance the Activity Diagram) and the tool for creating test cases was still made by hand.
UNTIL LAST SATURDAY!


Figure 1 Example Activity Diagram


Last Saturday I managed to make an interface between the Activity Diagram and my tool for creating Process Cycle Test cases.
The steps are very simple now:
1. Create the Activity Diagram (see figure 1) in Team Architect (Rosario April CTP),
2. Get the XML file of the Activity Diagram (saved on my disk),
3. Generate the PCT test cases (see the output); a sketch of this step follows below.
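
I won't publish my own tool here, but the principle of steps 2 and 3 can be sketched as follows. Note that the XML layout in the sketch (node/edge elements with source and target attributes) is an invented, simplified stand-in; the real Activity Diagram XML of the Rosario CTP has its own schema, so this illustrates the idea, not the actual interface.

# Sketch of steps 2 and 3: read an activity-diagram-like XML file and turn
# its structure into process cycle test cases (one per path through the flow).
# The XML below is a simplified, invented example, not the Rosario schema.

import xml.etree.ElementTree as ET
from collections import defaultdict

SAMPLE_XML = """
<activityDiagram name="Handle order">
  <node id="start" kind="initial"/>
  <node id="check" kind="decision"/>
  <node id="register" kind="action"/>
  <node id="reject" kind="action"/>
  <node id="end" kind="final"/>
  <edge source="start" target="check"/>
  <edge source="check" target="register" guard="order ok"/>
  <edge source="check" target="reject" guard="order not ok"/>
  <edge source="register" target="end"/>
  <edge source="reject" target="end"/>
</activityDiagram>
"""

def read_flow(xml_text):
    """Build a successor map (source node id -> list of target node ids)."""
    root = ET.fromstring(xml_text)
    flow = defaultdict(list)
    for edge in root.iter("edge"):
        flow[edge.get("source")].append(edge.get("target"))
    return flow

def paths(flow, node, goal, prefix=()):
    """Enumerate every path from the initial node to the final node."""
    prefix = prefix + (node,)
    if node == goal:
        yield prefix
        return
    for nxt in flow.get(node, []):
        yield from paths(flow, nxt, goal, prefix)

if __name__ == "__main__":
    flow = read_flow(SAMPLE_XML)
    for i, p in enumerate(paths(flow, "start", "end"), start=1):
        print(f"PCT test case {i}: " + " -> ".join(p))

The output of such a job is a plain list of executable paths, which is exactly what a Generalist Tester needs on paper to run a Process Cycle Test.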

Ready for testing? If you want to test completely manually: yes!
But I'm not satisfied yet. I want to import these test cases into Camano. (Camano is also part of the Rosario CTP; it is a tool for automating part of the manual work of Generalist Testers.)
My first experiences with Camano are positive! The UI is nice and intuitive! And I like the support for documenting defects and performing regression tests!

So in the coming weeks I'll try to get connected with Camano!!
(later more...)



Testing in the Lifecycle [ALM]... a focus on test coverage 
Sunday, April 13, 2008, 10:53 AM - ALM, Testing, TMap®
Posted by Rob Kuijt
When looking at testing, and more specifically at Test coverage, in the Lifecycle [ALM], you can conclude that much effort is spent on testing as well as possible, but nobody can tell you what Test coverage is achieved in the successive stages of the ALM.

Work must be done on the thinking and communication concerning the quality levels that should be reached! It has proved very difficult to choose the thoroughness of testing, and it has proved even more difficult to explain the executed Test coverage to the colleagues of the next test levels.

With the appearance of chapter 14, "Test Design Techniques", in TMap® Next, there is now some light on the horizon. In the "old TMap®" the Test coverage was expressed in terms of dynamic and static quality characteristics, coupled with test techniques, which nobody understood. Even the full-time testers had trouble understanding it through and through.

With the introduction of TMap® Next the Test coverage is expressed in a friendlier and more intuitive way, in terms like paths, decision points, CRUD (coverage of the basic operations), checklists, and so on...
Now we can explain the chosen Test coverage practically, in plain English!

I can give an example of how we introduced this way of expressing Test coverage in ALM, from a project I've done within Sogeti. In the project we planned a series of 5 successive test levels: Unit Tests (UT), Component Integration Tests (CIT), Technical End-to-end Test (EET), Functional Acceptance Test (FAT) and User Acceptance Test (UAT). Instead of designing the tests on an individual basis we created one overall "tuned" test strategy.



Picture: Clemens Reijnen; from his article:
Testing in the Lifecycle [ALM]... a focus on automation


This overall test strategy was designed in three layers:
  • First: for all test levels we determined the Basic Quality level, which can be seen as the absolute lowest level of Test coverage (labeled Bronze). Formal escalation is needed to escape from this Basic Quality requirement. And of course the depth of testing is expressed in the terms of chapter 14 of TMap® Next.
  • Second: based on the BDTM approach (see chapter 3.1 in TMap® Next), risk classes are determined for each combination of characteristic and object part (characteristic = what must be investigated; object part = what must be tested). The Test coverage above the Basic Quality level is, for all test levels, laid down in a so-called Master Test Plan. In my experience it is easy to communicate with stakeholders when the higher Test coverage levels are labeled Silver, Gold and even Platinum. And again the labels Silver, Gold and Platinum are expressed in chapter 14 terms (a sketch of such a matrix follows after this list).

  • Third: we introduced the Learning Cycle. Every time a blocking or costly defect occurred, we analyzed the defect and, if necessary, modified the Test coverage definition of the test level where the defect should have been found.
    Another example of working with the Bronze, Silver, Gold and Platinum labels can be found in chapter 7 "Development Tests" of TMap® Next.
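
To make the three layers tangible: the agreed matrix of characteristics, object parts, risk classes and coverage labels can be recorded in something as simple as the sketch below. The concrete labels, risk classes and techniques in it are example values of my own, not prescriptions from TMap® or from the project mentioned above.

# Sketch of a "tuned" test strategy matrix. The labels and the techniques
# attached to them are invented example values, not TMap(R) rules.

COVERAGE_LABELS = {
    # label: test design techniques, in TMap Next chapter 14 terms
    "Bronze":   ["checklist"],
    "Silver":   ["checklist", "process cycle test, test depth 1 (paths)"],
    "Gold":     ["decision points (condition/decision coverage)", "CRUD matrix"],
    "Platinum": ["decision points (multiple condition coverage)", "CRUD matrix", "real-life test"],
}

RISK_TO_LABEL = {"low": "Bronze", "medium": "Silver", "high": "Gold", "critical": "Platinum"}

# Risk class per (characteristic, object part), as agreed with the stakeholders
# in the Master Test Plan. Example entries only.
MASTER_TEST_PLAN = {
    ("suitability", "order process"): "high",
    ("functionality", "invoicing"): "critical",
    ("performance", "reporting"): "low",
}

def coverage_for(characteristic, object_part):
    """Look up the agreed coverage label and its chapter 14 techniques."""
    label = RISK_TO_LABEL[MASTER_TEST_PLAN[(characteristic, object_part)]]
    return label, COVERAGE_LABELS[label]

if __name__ == "__main__":
    for (characteristic, object_part), risk in MASTER_TEST_PLAN.items():
        label, techniques = coverage_for(characteristic, object_part)
        print(f"{characteristic} / {object_part}: risk {risk} -> {label}: {', '.join(techniques)}")

Because every label is defined in chapter 14 terms, the same matrix can be read by a developer, a tester and a stakeholder, which is exactly the point of the Bronze/Silver/Gold/Platinum vocabulary.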


ALM (Application Lifecycle Management) regards the process of delivering software as a continuously repeating cycle of inter-related steps: definition, design, development, testing, deployment and management. Each of these steps needs to be carefully monitored and controlled [Wikipedia].
For more definitions see the article about ALM Definitions in the blog of Clemens Reijnen.

TMap® (Test Management approach) is a registered trademark of Sogeti Nederland BV

How Popular is Your Blog? 
Monday, April 7, 2008, 10:19 PM - Fun
Posted by Rob Kuijt

How Popular is Your Blog?



The popularity of blogs can be derived from the number of times someone reads an article. Another way to measure your popularity is to count the number of times somebody made a URL link to your blog. You can measure your score with the "link:" feature of Google (link:<url>). In my case, my 3-week-old blog has a score of zero.

The next Question is: How Popular is Your Blog in comparison with Others?



The nice thing about the "link:" feature of Google is the possibility to measure it for other blogs as well. You can measure the link popularity of any blog you want. As a newcomer in the Testing Blog World it is very interesting to know which the most popular Testing Blogs are. So I did some research.....collected a list of Testing Blogs......wrote a little job to get the Google score.........and voilà: I know which are (today) the most popular Testing Blogs (from my list)!

Interested? I made a tag cloud at http://robkuijt.nl/testingblogs (also on the right behind "More Testing Blogs...").

Do you want a Testing Blog added to or removed from my list? Leave a comment on this entry and I'll change my list the next time I perform the measurement.

Nobody Wants to Deliver Crap! 
Saturday, April 5, 2008, 04:01 PM - Quality
Posted by Rob Kuijt

Nobody Wants to Deliver Crap!

In the nearly 30 years I have been working in the ICT world, I've seldom met someone who deliberately made errors to annoy his contractor, manager or whoever…. Everybody I met made his or her best effort to do their job as well as possible!
Nevertheless it is obvious that after joining the individual parts, their (and also my) team efforts mostly aren't good enough. Lack of quality is a costly and sometimes even lethal problem.

Some examples:
• A poorly programmed ground-based altitude warning system was partly responsible for the 1997 Korean Air crash in Guam that killed 228 people. ( www.cbsnews.com )

• Faulty software in anti-lock brakes forced the recall of 39,000 trucks and tractors and 6,000 school buses in 2000. ( www.spaceref.com )

• Software bugs, or errors, are so prevalent and so detrimental that they cost the U.S. economy an estimated $59.5 billion annually, or about 0.6 percent of the gross domestic product, according to a newly released study commissioned by the Department of Commerce's National Institute of Standards and Technology (NIST).( June 28, 2002) ( www.nist.gov )

• During a meeting of the Mars Exploration Program Analysis Group in Washington DC, NASA's John McNamee of the Mars Exploration Program addressed the recent failure of the Mars Global Surveyor (MGS) spacecraft. Apparently incorrect software doomed the spacecraft. (January 10, 2007) ( www.spaceref.com )


Numerous quality experts have tried to solve the problem of bad software quality with a huge number of quality methods, so I won't try to add one of my own..... No, I have an opinion about quality on a much, much smaller scale. Let's look at the following:

I think it is strange that one ICT assignment, given in parallel to, let's say, 20 individuals, results in many different outcomes; the chance that their work has the same outcome is practically NIL!
And another thing that amazes me: if you ask those 20 individuals whether they are certain about the quality of their work, mostly they can't predict the results of the subsequent test or review at all. Most of the time you get answers like: "I hope I did well" or "I think they won't find too many errors".

I hope she will not yell at me

I'm glad my dentist, my general practitioner and my car mechanic don't work the same way!!
I think that, if you look at the way the individual, in general, does his work in the ICT industry, huge improvements can be made by investing in baby steps!
Assuming that nobody wants to deliver crap: how can I help an individual (let's call him Mr. A) to do his work the right way?


I think Mr. A can be helped with 5 Simple Rules in the 3 stages of his ICT activity:
1) The assignment
2) Support and/or coaching on the job
3) Delivery and feedback

1) The Assignment


Simple Rule 1: The Assignment for the individual must be SMARTER.
SMARTER is a mnemonic used in project management at the objective-setting stage. It is a way of evaluating whether the objectives that are being set are appropriate for the individual project. (Wikipedia)
A SMARTER objective is one that is Specific, Measurable, Achievable, Relevant, Time-bound, Evaluated and Rewarding.

The assignment must give ALL the information that is needed to do the job the Right Way the First Time, including the required knowledge and experience, references to the instructions for the techniques and tools, the acceptance criteria (a mandatory list of quality checks), and a contact for asking questions (questions and answers must be documented, for instance by e-mail).
Before Mr. A starts his assignment he MUST know exactly what he must do to complete the job and get his applause. It is very important that Mr. A realizes that only he can judge whether the assignment is SMARTER "enough".

Simple Rule 2: Prevent Mr. A from doing the best he can! Good is good enough!
An extra note on the "Achievable" part:

Nobody can build a Rolls Royce for the price of a 2CV

An assignment concerning Functionality, Time, Money and Quality must be in balance! It must be clear at the start what must be done if the balance is disturbed!

2) Support and / or coaching on the job


Simple Rule 3: Fight the biggest enemies of Quality: Assumptions and Interpretations.
To fight the damage of assumptions, Mr. A can be helped in two ways:
First, we must help Mr. A to make the correct interpretations, so it is very important to give a proper kick-off. If Mr. A doesn't know how the product he is creating will be used, he can never make the right interpretations.
Second, Mr. A must learn to ask questions when he is not exactly sure about the choices he has to make during his activity.

3) Delivery and feedback


Simple Rule 4: Delivery must be done in a way that forces Mr. A to know the quality of his product.
Delivery can only be allowed if the product complies with the exit criteria. Mr. A must make a statement that he has performed the complete set of quality checks (exit criteria).

Simple Rule 5: Feedback must be given in a way that Mr. A can learn from his mistakes.
If an error is found at a later stage, feedback must be given in two ways.
First, to create the learning cycle, Mr. A must, after analyzing the problem, explain why the error slipped through his own detection.
Second, of course, the error must be corrected.


I don't know if it works for Mr. A, but I do know the 5 simple rules helped me a lot! Feel free to try them. Suggestion: try them step by step...baby steps....


I'm curious.... Can 5 Simple Rules affect the ICT Quality problem?
In other words: Can 5 Simple Rules make the ICT world stop running.... to learn baby steps?

In theory....every little bit can help...
In practice.....probably not....but I'll see...


In the meantime... I am satisfied making progress with my fellow Developers and Testers in my own surroundings...


And for the rest of the world.....don't worry.....

you've never time to do IT right...


but always time to do IT over!




Rob Kuijt
I'm Living in 2 Worlds 
Saturday, March 29, 2008, 04:03 PM - Testing
Posted by Rob Kuijt
I'm living in 2 Worlds.

In one world I'm working with Developers, trying to let them build software at a quality level the application user wants.
In the other world, at the other side of a "Wall", I am coaching Generalist Testers to do their job as well and as efficiently as possible.
Yes, my two worlds are separated by a fictitious Wall.
The Challenge: Developers and Generalist Testers, each on their own side of the Wall, make no effort to understand or help each other.

....don't make efforts to understand or help each other

D-world perceptions
In the D-world, the world of the Developers, we think Generalist Testers are pencil-pushing, nit-picky quality geeks. Mostly they are beside the point and easily replaced. They seem to like making a lot of noise about little defects, as if we made those errors deliberately....

T-world perceptions
In the T-world we don't hate the Developers for their perceptions. We are disappointed about the poor quality of the software.
Bad assumptions on the part of Developers are more to blame for the problems than software weaknesses are.
We never (or seldom) get software that will work right the first time. No, in the T-world we think that Developers forget for whom they are building software; it looks like they are building for themselves...... On the other hand we, Generalist Testers, realize that the Developers seldom get the time to do it right, but always get the time to do it over.... But why are some errors made over and over and over again?


The Walls between Developers and Testers have existed for many years now. And probably there are good reasons to go on like this..... but in my opinion there are more reasons for tearing down those Walls.
With the growing complexity of our IT world.... the most important one:
DEVELOPERS AND TESTERS NEED EACH OTHER TO DELIVER THE PROPER QUALITY!!


I like living in 2 Worlds, but I love it when Developers and Testers collaborate.
I've taken the Challenge!!!

Next entry: Pragmatic View on Proper Quality
