February 2014

The 41st Test Management Forum took place on Wednesday 5 February 2014 at the conference centre at Balls Brothers, Minster Pavement.

For the first time, we charged a fee of £20 plus VAT to attend. We had a fantastic response and, despite a troublesome tube strike, over 70 people attended. Thanks for your continued and future support.


13:30 Tea/Coffee
14:00 Introductions
14:15 Craig Hodgson, "Test Automation – driving testing efficiencies throughout the delivery process."
      Susan Windsor, "Step Up Your Career!"
      Joanna Newman, "Continuous Improvement in Testing"
15:30 Tea/Coffee
16:00 Peter Adrian, "Corporate Strength Agile"
      Richard Neeve, "Addressing The Apparent Under-Supply Of SDETs: Take Two"
      Mike Jarred and Matt Cardle, "Test Execution Prediction Modelling and Reporting"
17:15 Drinks Reception

Session Abstracts

Joanna Newman, Ericsson, "Continual Improvement in Testing – Using Test Exceptions and other data to improve your coverage, grow your team and improve the quality of your releases"

Test groups generate enormous amounts of data from test tools and as outputs from manual testing. We know that buried within it are indicators and trends that could help us improve the quality of our releases, reduce costs, or a combination of the two.

Following on from the July 2013 session on Big Data, this session will look at different techniques one can use to analyse test exceptions and other types of data for improvements that can contribute to a continual improvement program.

In this session we’ll discuss:

  • The differences between continual and continuous improvement programs
  • What types of data can lead to improvements
  • How long to spend on retrospective analysis
  • How to adopt a culture of review without slowing the team velocity
  • “What Now?” - What to do with the output of the analysis, how to determine and prioritise the various initiatives, and how to monitor them for success

This will be an interactive session so I hope to hear of your examples of using test exception and other data to improve your test processes.
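As a toy illustration of the kind of analysis the session describes — mining test exceptions for recurring patterns — the sketch below counts failure categories so the most frequent ones can be picked up as improvement candidates. The records and category names here are invented for illustration, not taken from any real tool:

```python
from collections import Counter

# Hypothetical test-exception records: (test_id, exception_category)
exceptions = [
    ("login_01", "timeout"),
    ("login_02", "timeout"),
    ("search_01", "assertion"),
    ("export_01", "environment"),
    ("export_02", "environment"),
    ("export_03", "environment"),
]

# Count how often each failure category recurs across the run
by_category = Counter(category for _, category in exceptions)

# Surface the most frequent categories as candidate improvement areas
for category, count in by_category.most_common():
    print(f"{category}: {count}")
```

In practice the input would come from your test management tool's exception export rather than a hard-coded list, but the principle — aggregate, rank, then feed the top items into the improvement backlog — is the same.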

Susan Windsor, Gerrard Consulting, "Step Up Your Career!"
When you think about developing your career, do you think of technical skills or do some of the softer interpersonal skills come onto your radar? As IT professionals, it’s all too easy to prioritise our technical skills. However, with the emphasis today on successful collaborative working, good communications skills are fast becoming essential to maximising your contribution and achieving the recognition and reward you deserve. But where do you start?

This session explores the different aspects of interpersonal skills that really can make a difference to career progression. I’ll outline a number of topics that relate to my own career and then we can discuss and share experiences on the most relevant ones to the group. The topics I’ll introduce are:

  • Recognising communications styles, for yourself and your team mates
  • Understanding cultural and skill differences
  • Adopting a coaching leadership style
  • Re-inventing your personal goals
  • Identifying effective team attributes.

I’ll share some techniques to assist your development in each of these areas and include some hand-outs for you to take away and practice back at base.

With the pressure on testers in the current market, specialisation is critical. If your role includes leadership tasks or team working, you can benefit from improving your interpersonal skills. Softer skills are not a "nice to have" any more; they are becoming essential.

Why not come along and identify your personal development action plan!

Richard Neeve, Independent, "Addressing The Apparent Under-Supply Of SDETs: Take Two"

At last October's TMF I ran a session about the apparent under-supply of Software Development Engineers In Test (SDETs) and offered an analysis of its causal factors, but there were two problems:

  1. I got a bit carried away with my analysis which meant that
    although I suggested some potential solutions, we didn't have
    enough time to discuss them in any meaningful depth.
  2. Despite making a promise to the contrary, my facilitation was
    too generous in allowing the discussion of more nebulous
    questions like "What is agile anyway?" and "Is the SDET concept
    just a marketing thing?".

This second talk will be kick-started with a brutally cut-down version of my previous slide deck and will have a relentless focus on what the solution(s) might be. The full-fat slide deck from my last talk and my notes from the discussion can be found here.

Mike Jarred, Director of Software Delivery, and Matt Cardle, Test Delivery Manager, IDBS, "Test Execution Prediction Modelling and Reporting"

Test departments have been criticised for providing information too late in the SDLC to influence outcomes. This contributes to testing being viewed by senior stakeholders as not adding value to project teams. Accurate estimation, monitoring and controlling of test execution can provide early warning signals that testing may not deliver to time and scope targets. The same data can also be used to model options that project teams can use to take corrective action and ensure successful delivery.

Matt Cardle and Mike Jarred work at IDBS and have been evolving usage of predictive modelling in their test projects, and know it provides valuable information to stakeholders. They are keen to progress this work further by discussing with other practitioners their experiences, or their views of how to bring greater benefit to IT projects by improved modelling.

Based on their experience this session will commence with the following:

  • Rationale of why prediction modelling adds value to projects and organisations
  • The data required to enable modelling to take place
  • The metrics provided by the data, and how the data and metrics are validated
  • An example of a prediction model for testing
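As a minimal sketch of what a test execution prediction model can look like — not the speakers' actual model, just a naive linear burn-up assumed for illustration — the function below extrapolates the observed daily execution rate to predict a completion date, which can then be compared against the fixed delivery date as an early warning signal:

```python
from datetime import date, timedelta

def predict_completion(total_tests, executed, start, today):
    """Naive linear burn-up: extrapolate the execution rate observed
    so far to predict when the remaining tests will be executed."""
    days_elapsed = (today - start).days
    if days_elapsed <= 0 or executed <= 0:
        return None  # not enough data to extrapolate a rate
    rate = executed / days_elapsed              # tests per day so far
    remaining_days = (total_tests - executed) / rate
    return today + timedelta(days=round(remaining_days))

# 200 of 500 tests executed in the first 10 days -> 20 tests/day,
# so the remaining 300 should take roughly 15 more days.
print(predict_completion(500, 200, date(2014, 1, 6), date(2014, 1, 16)))
# → 2014-01-31
```

A real model would also weight for defect rates, environment downtime and test complexity, but even this crude extrapolation makes a slipping end date visible weeks before it would otherwise be reported.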

This session will then explore the following:

  • What other techniques for predictive modelling are currently being used
  • What other inputs / metrics could be used when modelling outcomes
  • Challenges and constraints of using predictive modelling

This discussion is aimed at senior practitioners interested in having visibility, control and predictability in non-trivial test execution projects, where scope and time are fixed.

Craig Hodgson, Centre4testing, "Test Automation – driving testing efficiencies throughout the delivery process."

The role of test automation has shifted. Historically, adopters focused on latter-stage system regression testing, largely against the GUI; replacing the manual regression testing burden and facilitating greater test coverage. Today, there is a drive to automate much earlier – whether at unit test level, integration testing or at build and deployment stages in CI environments.

Let’s consider the benefits and challenges of earlier adoption of automation, including:

  • Facilitating more efficient testing from the first line of code
  • If we automate earlier and reduce risk, do we even need to automate system regression testing?
  • Who should implement and own test automation?
  • What’s in our toolbox – is there really a ‘one tool solution’?
  • How do we measure success?

In this session, we encourage participants to share their own experiences and consider how they might leverage test automation in ways they may not practice today.
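One concrete shape that "automating earlier" can take is a unit-level check that runs on every CI build, long before any GUI regression suite. The sketch below is a hedged illustration only — the function under test and its behaviour are invented, not from any speaker's material:

```python
import unittest

def normalise_postcode(raw):
    """Toy function under test: trim whitespace and upper-case."""
    return raw.strip().upper()

class TestNormalisePostcode(unittest.TestCase):
    # A check at this level executes in milliseconds on every commit,
    # catching regressions well before system-level testing begins.
    def test_strips_and_uppercases(self):
        self.assertEqual(normalise_postcode(" ec3r 7pp "), "EC3R 7PP")

if __name__ == "__main__":
    unittest.main()
```

Wired into a CI pipeline, a suite of such tests gives the fast feedback loop the abstract describes, shifting defect detection from the regression phase to the moment code is written.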

Peter Adrian, Sogeti, "Corporate Strength Agile"

In the corporate environment we are faced with the challenge of delivering programme-level objectives using highly flexible Agile teams, without undermining that flexibility.

This is very similar to the problem faced by the military in the modern, highly mobile combat environment. They need to allow troops at the front the flexibility to respond effectively to tactical situations whilst maintaining the overall operational objectives.

This session looks at how the U.S. Marine Corps have addressed these issues and how their approach could be employed in Agile development programmes. The session will consist of a short presentation, with the majority of the time given to an open discussion of the ideas raised.