October 2013

The 39th Test Management Forum took place on Wednesday 30 October 2013 at the conference centre at Balls Brothers, Minster Pavement.

The meeting was sponsored by TestPlant and was, as usual, FREE to attend.

Session Summaries

Brindusa Axon, Unreasonable Minds, "Deriving scope from goals using stories"
User stories are a widely adopted technique for agile teams. They represent customer wants and needs and are units of valuable change to the system under development.

The main problem with stories is that we use them as a stand-in for something we are used to working with: requirements. What we often ignore is that they are not a simple substitute for requirements; rather, they are negotiable expressions of intent, meant to satisfy a particular need. That negotiability enables us to look for better options, to pivot, and even to discard features that are not truly needed.

In this workshop we're going to try our hand at using stories to derive scope from business goals. This interactive session will not only give you the opportunity to experience story writing, but also to explore the plethora of options for achieving a particular business goal.


Gordon McKeown, TestPlant, “Is GUI automation now an essential part of performance testing?”
Automating the application GUI is increasingly being used for performance testing. There are several compelling reasons for this:

  1. New Web technologies such as HTML5 (WebSocket API) and asynchronous AJAX requests are difficult to handle using the traditional HTTP “replay” approach.
  2. The rise of mobile clients and innovative device types.
  3. The realisation that client-side behaviour may vary depending on server behaviour under load.
  4. Increasing application complexity.

It has also been suggested that the need for tester programming and technical skills can be reduced and that functional testing scripts can be re-used for load testing.

To keep the discussion concrete, we will compare three different test scripts for the same Web application transaction using image-based GUI automation, Selenium and HTTP “replay”.
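
By way of illustration only (these are not the session's scripts; the URL, element IDs and credentials below are invented), the following Python sketch times the same hypothetical login transaction in two of the three styles: a raw HTTP “replay” of the recorded request, and GUI automation through Selenium. The image-based approach is left out here because it depends on tool-specific scripting.

    # Illustrative sketch only: times one hypothetical "login" transaction two ways.
    # The URL, element IDs and credentials are invented, not taken from the session.
    import time

    import requests
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC

    BASE_URL = "https://example.test/app"  # hypothetical system under test


    def login_via_http_replay():
        """HTTP 'replay' style: POST the recorded form data; no browser is involved."""
        start = time.perf_counter()
        response = requests.post(BASE_URL + "/login",
                                 data={"user": "perf01", "password": "secret"})
        response.raise_for_status()
        return time.perf_counter() - start


    def login_via_gui_automation():
        """GUI automation style: drive a real browser, so client-side JavaScript,
        AJAX calls and rendering are all included in the measured time."""
        driver = webdriver.Chrome()
        try:
            start = time.perf_counter()
            driver.get(BASE_URL + "/login")
            driver.find_element(By.ID, "user").send_keys("perf01")
            driver.find_element(By.ID, "password").send_keys("secret")
            driver.find_element(By.ID, "submit").click()
            # Wait until the post-login page signals that the transaction is complete.
            WebDriverWait(driver, 10).until(
                EC.presence_of_element_located((By.ID, "dashboard")))
            return time.perf_counter() - start
        finally:
            driver.quit()


    if __name__ == "__main__":
        print("HTTP replay:    %.3fs" % login_via_http_replay())
        print("GUI automation: %.3fs" % login_via_gui_automation())

The contrast illustrates reason 3 above: the GUI-driven timing includes client-side work that the bare HTTP replay never exercises, which is precisely the behaviour that can change when the server is under load.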

The session is part briefing, part pooling of experience and part airing of views on an important topic for testing professionals.

Richard Neeve, “Software Developers In Test: The Supply-Side Crisis Facing Agile Adopters”
In 2001 I attended a SIGIST conference where a speaker stood before the audience, many of them weary veterans of previous methodology wars, and prophesied that the agile philosophy of which he was a devoted advocate would, unquestionably in his view, take hold and spread into the mainstream in a sustained way. Many scoffed at the notion, but he was right. Whatever you think of agile - and it has its detractors - the undeniable market reality is that many (perhaps even most) working environments are either genuinely agile, think they are, are actively moving towards being so, are seriously planning to be so soon or at least genuinely aspire to be so in the future.

For a while we testing folk seemed collectively quite unsure about whether or how we fitted into the agile paradigm, but that time is long past and the conflation of development and test remits is one aspect of the consensus outcome, for good or for ill.

And yet, despite the writing having been on the wall for so long and notwithstanding deliberately provocative talks at the TMF about the limited future of vanilla manual testers, we (the testing community) have been sleepwalking into being caught almost completely flat-footed in attempting to supply what is now a broad-based and rapidly growing demand for SDITs (Software Developers In Test). We simply are not geared up to supply these resources in the volumes needed to satisfy this demand, and we are all chasing a very small pool of candidates. I believe this is one of the top challenges facing the testing discipline today; arguably the biggest.

This session will offer a reality check on where we are, identify the dynamics that led us to this position and seek ideas on what we can do to scale SDIT supply. Be warned, I don’t have the answers on that last point so if ever there was a TMF session that relied on active participants, it’s this one.

Richard's slides can be found here.

Mike Bartley, “Requirements Management – turning compliance to business advantage”
Requirements-based testing aims to design a necessary and sufficient set of test cases, derived from the product requirements, to ensure that the design and code fully meet those requirements. It is important from both a commercial and a standards perspective.

A number of standards, such as ISO 26262, mandate that certain safety-related requirements have a demonstrable audit trail through to implementation and sign-off. A requirements-based testing approach helps with this.

From a commercial perspective it helps to ensure that a product meets all of its requirements. However, it can also help to ensure that every testing activity is associated with a product requirement. This helps to eliminate over-testing, which adds cost and can delay market entry.
In this talk we first explain requirements-based testing and the legal obligations it carries in some industries. Rather than seeing it as a cost, however, the talk explains how it can be turned to commercial advantage.
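
As a toy illustration of the traceability idea (the requirement IDs, test names and mapping below are invented), this Python sketch checks the two things a requirements-based approach cares about: requirements with no test against them (an audit-trail gap) and tests linked to no requirement (candidates for the over-testing mentioned above).

    # Illustrative requirements-to-test traceability check.
    # Requirement IDs, test names and the mapping are invented for illustration.

    requirements = {"REQ-001", "REQ-002", "REQ-003"}

    # Which requirement(s) each test claims to cover.
    test_coverage = {
        "test_login_success": {"REQ-001"},
        "test_login_lockout": {"REQ-001", "REQ-002"},
        "test_export_report": set(),  # linked to no requirement
    }

    covered = set().union(*test_coverage.values())

    uncovered_requirements = requirements - covered      # gaps in the audit trail
    unlinked_tests = [name for name, reqs in test_coverage.items() if not reqs]

    print("Requirements with no test:", sorted(uncovered_requirements))  # ['REQ-003']
    print("Tests with no requirement:", unlinked_tests)                  # ['test_export_report']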

Mike's slides can be found here.

Davidson Devadoss and Chris Comey, Testing Solutions Group, “Provision of Assurance on a Data Centre move in less than 90 minutes!”
One of the leading global law firms decided to move its large Asian datacentre from one service provider to another. This would involve physically moving hundreds of servers and a vast amount of storage, backup systems and so on to a new location over a weekend. As the firm operates 24/7/365, it has other datacentres in the UK providing service during the move. It was imperative that the new datacentre was able to provide service to all of the firm's Asian fee-earners by the following Monday.

A Test Manager was engaged to develop a migration test plan and coordinate all of the testing activities to ensure that the client would not:

  • Lose business
  • Lose revenue
  • Lose productivity
  • Breach regulations.

By using an innovative, practical and collaborative approach to testing, TSG made it possible to complete all of the planning, preparation and testing from the UK, confirming service availability within 90 minutes of handover from the hosting team and giving stakeholders confidence, within an astonishing time frame, that the data centre move had indeed been a success.

Paul Gerrard, Gerrard Consulting, “Introducing Test Analytics”
Paul defines Test Analytics as: “The capture, integration and analysis of test and production monitoring data to inform business and software development decision-making”.

Production monitoring data can be integrated with test (execution) monitoring data. Can this aggregated data *all* be regarded as ‘test monitoring’ data? It takes only a small leap of imagination to treat the logging of system use in a production environment in the same way as system tests in a test environment. The source is different but the nature of the captured data can be very similar – if we choose it to be so.

Paul will describe how data captured throughout a test and assurance process could be merged and integrated with definition data (requirements and design information) and production monitoring data and analysed in interesting and useful ways.
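
As a minimal sketch of that kind of integration (all field names and figures below are invented), the following Python fragment merges test-execution results with production-monitoring events into a single view keyed by feature, so that test pass rates sit alongside production error rates. A fuller pipeline would join in requirements and design identifiers in the same way.

    # Toy illustration: merge test-execution data with production-monitoring data
    # into one 'test analytics' view keyed by feature. All data is invented.
    from collections import defaultdict

    test_results = [
        {"feature": "checkout", "test": "test_pay_by_card", "passed": True},
        {"feature": "checkout", "test": "test_pay_by_voucher", "passed": False},
        {"feature": "search", "test": "test_basic_search", "passed": True},
    ]

    production_events = [
        {"feature": "checkout", "requests": 12000, "errors": 84},
        {"feature": "search", "requests": 45000, "errors": 12},
    ]

    analytics = defaultdict(dict)

    for row in test_results:
        stats = analytics[row["feature"]].setdefault("tests", {"run": 0, "passed": 0})
        stats["run"] += 1
        stats["passed"] += int(row["passed"])

    for row in production_events:
        analytics[row["feature"]]["error_rate"] = row["errors"] / row["requests"]

    for feature, data in sorted(analytics.items()):
        tests = data["tests"]
        print("%s: %d/%d tests passing, production error rate %.2f%%"
              % (feature, tests["passed"], tests["run"], 100 * data.get("error_rate", 0.0)))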

Clearly, it is easier to imagine Test Analytics fitting into a Continuous Delivery regime, but the discipline could and should be applicable to some degree to all Agile and Structured approaches – at least in principle.