MBT Mini project

Introduction

In the practical part of the course you’ll take the role of a Quality Assurance team for a small software project. Your team will have to do the following:

  • build/install the project
  • do some exploratory testing
  • write some test scripts
  • model (part of) the software
  • and finally, generate more tests from your model and write an adapter to execute those tests on the SUT

This mini-project is split into different assignments. You’ll have to submit the result of each assignment to the [fire] system. In addition, some assignments require you to present your findings to the class.

Mailing List

There is a mailing list for the course, to which we will send important announcements and information. Subscribe at [forum].

Important dates

Deadline for Assignment 1: Forming groups and choosing your SUT: Monday March 30

Presentation Assignment 2: Exploratory testing: Wednesday April 1st

Deadline for Assignment 3: Automated tests script: Thursday April 30

Deadline for Assignment 4: Modeling: Monday May 11

Presentation Assignment 5: Test generation and execution: Wednesday May 27

Deadline for Assignment 5: Test generation and execution (report): Tuesday May 26

Assignment 1: Forming groups and choosing your SUT

First, you need to form groups of 3 students. If you cannot find other people to join, ask on the mailing list. If this doesn’t work either, contact the lab assistant. Groups cannot have more than 3 people, and groups of fewer than 3 will only be accepted under exceptional circumstances (contact the lab assistant if you think you qualify). The first part of this assignment is to choose your SUT (software under test). We provide you with a list of possible projects that we think are good candidates for this course:

°=taken

Warning

Note that no two groups can have the same SUT. If two groups choose the same one, it will be attributed on a first-come, first-served basis (where the submission date in fire counts).

If you have your own idea of a project (not among the proposed ones) please contact the lab assistant with a description of your SUT, the motivation behind your choice and if possible a link to the source code and/or documentation.

Note

Contributing back: All the projects suggested above are open source projects with a public issue tracker. While it is not mandatory for this course that you contribute back to the project, writing a bug report can be a great exercise and the developer would most likely appreciate feedback from your testing effort.

Be friendly and polite and if you have any doubts, ask the course assistant for help in crafting your issue report.

Important

All 3 members of your group must create an account in [fire]. The first one to register creates a new group and shares the group password with the other two. Once this is done, you can submit the first lab: create a simple text file (no .doc!) with your chosen SUT and the names of the three group members. You can include several SUTs in your submission and you’ll be assigned the first one which is still available. Deadline: Monday March 30

Assignment 2: Exploratory testing

Install the chosen SUT and start doing exploratory testing, that is,

“a style of software testing that emphasizes the personal freedom and responsibility of the individual tester to continually optimize the quality of his/her work by treating test-related learning, test design, test execution, and test result interpretation as mutually supportive activities that run in parallel throughout the project.”

Test as many functions as you can and try to come up with difficult or unusual use cases that the software might not handle well. Read the documentation and check that the software behaves as described.

Take some notes:

  • was the SUT easy to install?
  • is the documentation up to date?
  • is the documentation complete?
  • did you find some corner cases that aren’t properly handled?

Important

No fire submission in this part but you’ll have to present your SUT and your testing results during the course on Wednesday April 1st. Prepare a presentation of about 10 minutes describing:

  • Basic information about the SUT (name, purpose, main language...)
  • Your test process
  • The results of your tests

Assignment 3: Automated tests script

Write 3 test cases as xUnit scripts against your SUT. Your test cases should test 3 different functionalities of the SUT and be able to run without manual intervention. In particular, they should be able to decide automatically whether the test is a success or a failure.

You’ll need to find a way for your tests to control your SUT.
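As a sketch of what “run without manual intervention” means here, assuming a hypothetical SUT entry point (a `Calculator` class with an `add` method, which stands in for whatever interface your real SUT exposes), a test script might look like this. Plain Java assertions are used so the example stays self-contained; in practice you would write the same check with your xUnit framework’s `@Test` and `assertEquals`:

```java
// Hypothetical stand-in for the SUT: replace Calculator with however
// your real SUT is driven (command line, library call, HTTP request, ...).
class Calculator {
    int add(int a, int b) { return a + b; }
}

public class CalculatorTest {
    // One automated test case: it drives the SUT and decides pass/fail by itself,
    // with no human reading the output.
    static void testAddition() {
        Calculator sut = new Calculator();
        int result = sut.add(2, 3);
        if (result != 5) {
            throw new AssertionError("add(2, 3) returned " + result + ", expected 5");
        }
    }

    public static void main(String[] args) {
        testAddition();
        System.out.println("All tests passed");
    }
}
```

The important part is the explicit pass/fail decision in the test body: the script itself compares the SUT’s actual behavior to the expected one.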

Important

Send your test script together with a report with:

  • A description of the tested functionalities
  • A description (in English) of your tests
  • An explanation of how you control your SUT from your test scripts
  • Your findings and comments

Deadline: Thursday April 30

Assignment 4: Modeling

In this assignment, you will create a model of your SUT that you will then use in the last assignment to generate tests. Your model should be an extended finite state machine (EFSM) that allows you to generate interesting, arbitrarily long test sequences against your software under test. Your EFSM must have about 20 inputs, at least one internal variable, and be able to generate infinite test sequences.

Your model doesn’t need to cover the whole SUT; in particular, if it has many functionalities, you can model only part of them. Start by modeling the most basic functionalities and add more detail progressively until you reach the desired number of inputs.
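To make the EFSM ingredients concrete, here is a toy example, not tied to any of the proposed SUTs: a login screen modeled with explicit states, one internal variable (a retry counter), and guarded transitions whose outcome depends on that variable. All names here are illustrative:

```java
// Toy EFSM: a login dialog with an internal retry counter (the "extended" part).
// States: LOGGED_OUT, LOGGED_IN, LOCKED.  Internal variable: attempts.
public class LoginModel {
    enum State { LOGGED_OUT, LOGGED_IN, LOCKED }

    State state = State.LOGGED_OUT;
    int attempts = 0;  // internal variable: the same state behaves differently depending on it

    // Input: wrong password.  Guard: only enabled while logged out.
    void badLogin() {
        if (state != State.LOGGED_OUT) return;
        attempts++;
        if (attempts >= 3) state = State.LOCKED;  // transition depends on the internal variable
    }

    // Input: correct password.
    void goodLogin() {
        if (state != State.LOGGED_OUT) return;
        state = State.LOGGED_IN;
        attempts = 0;
    }

    // Input: log out.  This loop back to LOGGED_OUT is what makes
    // infinite test sequences possible.
    void logout() {
        if (state != State.LOGGED_IN) return;
        state = State.LOGGED_OUT;
    }
}
```

Your real model will of course be larger (about 20 inputs), but each input should follow this shape: a guard saying when it is enabled, and an update to the state and internal variables.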

Important

In your report, you should include the following parts:

  • The specifications (in English) of the behavior of the modeled part of your SUT
  • The list of system inputs in your model with their description
  • A description of the state space of your model
  • The transition table
  • A graphical representation of your model

See [ROS2000] for an example of how you can format your report (only the modeling part, not the implementation part).

Deadline: Monday May 11

Assignment 5: Test generation and execution

In this assignment, you will implement the model that you developed in assignment 4 using [ModelJUnit].

Start by implementing the model alone by implementing the interface FsmModel, then connect each action in your model to your SUT in a separate adapter class (See the ModelJUnit tutorial for an example).
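The model/adapter split can be sketched as follows. This is plain Java with illustrative names, not ModelJUnit code: the `FsmModel` interface and `@Action` annotations are omitted so the example stays self-contained, and the real interface additionally requires a `reset(boolean)` method (see the ModelJUnit tutorial). The point is the division of labor: the adapter knows how to drive the SUT, the model keeps the abstract state and checks that the SUT agrees with it:

```java
// Sketch of the model/adapter split.  Names are illustrative, not from ModelJUnit.
// The adapter hides how the SUT is actually driven (command line, HTTP, GUI, ...).
interface SutAdapter {
    void pressStart();
    boolean isRunning();
}

// A dummy adapter standing in for your real SUT connection.
class FakeAdapter implements SutAdapter {
    private boolean running = false;
    public void pressStart() { running = true; }
    public boolean isRunning() { return running; }
}

// The model keeps its own abstract state and, on each action, both updates
// that state and tells the adapter to perform the corresponding step on the SUT.
public class SutModel {
    private final SutAdapter sut;
    private boolean running = false;  // abstract model state

    public SutModel(SutAdapter sut) { this.sut = sut; }

    public Object getState() { return running; }  // what the tool observes for state coverage

    public void reset() { running = false; }  // with the real interface: reset(boolean testing)

    public void start() {
        if (running) return;          // guard: action only enabled when stopped
        sut.pressStart();             // drive the SUT through the adapter
        running = true;               // update the model
        assert sut.isRunning();       // oracle: model and SUT must agree
    }
}
```

Keeping the adapter behind an interface also lets you debug the model on its own (with a fake adapter, as above) before connecting it to the real SUT.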

ModelJUnit offers several testing strategies and coverage metrics. Read more about them in the ModelJUnit documentation (Testing strategies: AllRoundTester, GreedyTester, LookaheadTester, RandomTester ; Coverage metrics: ActionCoverage, StateCoverage, TransitionCoverage, TransitionPairCoverage).

Before running the test generator, think about which strategy will work best. Which metric do you think is the most interesting for evaluating your model?

Generate tests using your model with the different testers available in ModelJUnit. Collect coverage metrics for the generated tests and report the collected values in a table like the following.

                   ActionCoverage   StateCoverage   TransitionCoverage   TransitionPairCoverage
  AllRoundTester
  GreedyTester
  LookaheadTester
  RandomTester

Note

You might need to experiment with the size of the generated test sequence: obviously, if your model has 30 transitions, ModelJUnit can’t possibly cover it with a 20-step test!

Report the test length you are using.

Important

For this assignment, you will need to submit the following documents:

  • The Java source files for your model and your adapter
  • A report addressing the following:
    • difficulties and remarks about the implementation of your model and adapter using ModelJUnit
    • how did you implement reset?
    • an explanation, in your own words, of each test generation strategy available in ModelJUnit (AllRoundTester, GreedyTester, LookaheadTester, RandomTester)
    • an explanation, in your own words, of each coverage metric available in ModelJUnit (ActionCoverage, StateCoverage, TransitionCoverage, TransitionPairCoverage)
    • the above coverage table
    • discussion and conclusion

Deadline: Tuesday May 26