Annual Summits

Assurance Leadership Summit - Archive


2007 Test Management Summit

Session 1A

Metrics, Estimating & Planning

 

Sam Clarke, nFocus

Sam's PowerPoint Slides

Sam's Workshop Notes

Delegate Notes and Feedback

Session 1B

Contracts, Acceptance Criteria and Outsourced Development

Susan Windsor, WMHL Consulting

Susan's PowerPoint Slides

Delegate Notes and Feedback

Session 1C

Supporting Stakeholders with Feature Coverage


 

Peter Farrell-Vinay, Alphabite

Peter's PowerPoint Slides

Session 2A

Managing Offshore Testing

Margaret Edney, Thomson Financial


Margaret's PowerPoint Slides

Delegate Notes and Feedback

Session 2B

Future for Test Automation

Sarah Saltzman, Compuware

Sarah's PowerPoint Slides

Delegate Notes and Feedback

MS Visio Mindmap

Mindmap GIF image

Session 2C

Test Management Models



Ivan Ericsson, SQS Software Quality Systems Ltd

Ivan's PowerPoint Slides

Delegate Notes and Feedback

Session 3A

Testing From The Start – Test Driven Development

Fran O’Hara, Insight Test Services

Fran's PowerPoint Slides

Delegate Notes and Feedback

Session 3B

Finding the Right Test Manager



Bogdan Bereza-Jarocinski, BBJTest

Bogdan's PowerPoint Slides

Delegate Notes and Feedback

Session 3C

Improve Testing, Improve Software



Paul Gerrard, Gerrard Consulting

Paul's PowerPoint Slides

Delegate Notes and Feedback

Session 4A

Test Assurance – Ensuring Stakeholders get what they want


Paul Gerrard, Gerrard Consulting

Paul's PowerPoint Slides

Session 4B

What is the Big Issue – What causes issues to slip through the testing net?

Colin Robb, Mercury

Colin's PowerPoint Slides

Delegate Notes and Feedback

Session 4C

Future for Test Manager Skills

 


Stephen Allott, ElectroMind Limited

Stephen's PowerPoint Slides

Delegate Notes and Feedback

 

2008 Test Management Summit







Breakfast Facilitators Briefing

Coffee and Registration

Welcome and Introductions – NASH ROOM

Rooms: Waterloo, Trafalgar 2, St. James 1, St. James 2

Session 1A

Agile Testing

David Evans/Ivan Ericsson, SQS

PowerPoint Slides

 

Session 1B

Process Improvement - Part of the Test Manager's Role?

Geoff Thomson, Experimentus

PowerPoint Slides

Session 1C 

Top Ten Tips for Test Business Plans and Business Cases

Declan Kavanagh, Insight Test Services

PowerPoint Slides

Session Notes - Declan

Session 1D

Automated Regression Testing - is this the key to IT Change Assurance?

Sam Clarke, nFocus

Abstract

Powerpoint Slides (PDF)

Session Notes - nFocus

Break – NASH ROOM

Session 2A

The Future of Functional Automation

Duncan Brigginshaw, Odin Technologies

PowerPoint Slides

Session Notes

Session 2B

Transforming Business People into Testers and Test Managers - How?

Paul Gerrard, Aqastra

PowerPoint Slides

Session Notes

Session 2C 

Protecting IT Service with a Federated Business

Ben Brundell and Nigel Redman, HBOS

PowerPoint Slides

Session 2D

Non-Functional within OAT

Stevan Zivanovic, Independent Consultant

PowerPoint Slides

Lunch – NASH ROOM

Rooms: Waterloo, Trafalgar 2, St. James 1, St. James 2

Session 3A

Getting Business Value out of Metrics

Richard Terry and Rob Baarda, Sogeti

PowerPoint Slides

Session 3B 

Application Security and Testing

Colin Robb, Hewlett-Packard

PowerPoint Slides

Session 3C

Keys to Successfully Hiring and Retaining Your Testing Team

Jane Muller, Pervue

PowerPoint Slides

Test Manager’s Forum Jan 30th

Session 3D

The Dandelion Model - Weed or Herb - Can negative attributes be good for testing?
Thorkil Sonne, Specialisterne; Susan Windsor, WMHL Ltd; Steve Allott, ElectroMind; Anne Mette Hass, Delta

Abstract

Anne-Mette's PowerPoint

Thorkil's PowerPoint

Break – NASH ROOM

Session 4A

Managing 3rd party Supplier Testing

Tony Simms, Roque Consulting

PowerPoint Slides (PDF)

Session Notes

Session 4B

Accelerate Testing Cycles with Collaborative Performance Testing

Dan Koloski, Empirix

Abstract

Session 4C

ERP Maintenance Testing

Graham Marcus, Intercontinental Hotels

PowerPoint Slides

Session Notes

Session 3D

Dandelion Model Continued

Reconvene in NASH Room

Keynote: Paul Herzlich, "The UK Test Management Forum Survey"
Paul's Presentation

Drinks Reception – NASH ROOM

Dinner in Burton Room (with Merchant String Quartet)

Close

2009 Test Management Summit

The Third Test Management Summit took place on Wednesday 28 January 2009 at the sumptuous Institute of Directors at 116 Pall Mall, London.

The day was generously sponsored by our PATRONS, HP, SOGETI and SQS UK and our Summit Sponsors, FACILITA, NEOTYS and ORIGINAL Software.

You must be registered and logged-in to see the session downloads.

SUMMIT SESSIONS

Pragmatic Testing in Agile Projects

Stuart Reid

Testing Solutions Group

Getting Value out of Quality Center

Paul Rolfe

Hewlett-Packard

Managing Code Quality and Delivery in the 21st Century

Sebastian Paczynski

SQS

What Influences Me in Software Testing

Graham Thomas

Independent Consultant

Survival Skills for Difficult Times

Bogdan Bereza-Jarocinski, Better Software

Exploring Open and Free Tools

Alan Richardson

Compendium Developments

Additional notes from Alan

Managed Outsourced Testing

Paul Godsafe

Independent Consultant

Load and Performance Testing Challenges for 2009

Gordon McKeown

Facilita

Knowledge Management on a Testing Team

Dan Prokopiwskyi

Sogeti

Exploring Why Software Automation Fails or Succeeds

George Wilson

Original Software

Why Aren't the Testers Testing Security?

Ed Hill

Hewlett-Packard

Performance and Test-Driven Development: Are they Compatible?

Alan Gordon and David Evans

SQS

Result-Driven Testing: Business Alignment in Order to Show Our Added Value

Derk-Jan de Grood

Collis BV

Performance Testing Rich Internet Apps and Web Apps

Thomas Ripoche

Neotys

Agile Seems to Have Cracked the Unit Test Issue but What About the Rest of Testing?

Sam Clarke

nFocus

The Wider Benefits of the CCTM Security Testing Scheme

Peter Fagan

Sogeti

Keynote Talk: Paul Gerrard

The Future of Testing: it's All About Value

2010 Test Management Summit

The abstracts and content for the 2010 TMF Workshops and Summit can be accessed through the links below.

Test Management Workshops - Abstracts and Materials

The downloadable material (PowerPoint, PDF or Word documents) can be found at the bottom of this page. The links will not be available unless you are a logged-in, registered user. You can register for the site here.

Alan Richardson, Compendium Developments, Exploratory and Innovative Testing Workshop

Does your test team innovate and generate new approaches to your testing? As test managers we can build processes and shape our team culture to encourage innovation, learning and exploration.

This hands-on workshop will teach the principles and techniques of exploratory learning. You will use, and learn how to identify, tools which augment your testing, and we will discuss ways of identifying and building tailored approaches to help us innovate and get the most out of our teams and our testing.

This workshop is for testers and managers willing to share their experiences and keep learning and moving their test process forward.

Equipment: Bring a laptop (with wireless or optionally a 5m network cable).

Susan Windsor, WMHL Consulting, Don’t shoot the messenger!

How many times have we taken criticism for being the bearer of bad news? Why are we misunderstood and not appreciated?

Getting our message over effectively requires others to receive it in a way they can understand. So one key area we all strive to improve in the testing profession is communication. Being able to communicate effectively can enhance your career in your current organisation, improve your value as a potential employee and provide you with greater self-confidence.

This workshop covers three elements of effective communication and Susan will stage exercises for each:

  • The science of persuasion
  • The art of story telling
  • Communications styles

At the end of the workshop, you will be able to identify how you can apply these techniques in your workplace.

Ray Arell, Intel, Building a SCRUM-based Test Strategy

In this workshop Ray will walk through the creation of a validation test strategy that fits the Agile SCRUM development life-cycle.

This will include an overview of SCRUM and the benefits/pitfalls for validation teams, a sample test framework, a deeper dive into customer personas and how to use them effectively in sprint deliverables, defect management, a breakdown of test methods that work well and ones that don't and why, test architecture impact, training, and key measurements.

Also, Ray will give insight into his team's transformation from a waterfall development culture to SCRUM. This will include a deeper view of the people impact within a large paradigm shift. This tutorial should be helpful for both managers and individual contributors who are at the start of their agile journey.

Paul Gerrard, Gerrard Consulting, Open Source and Free Tools Workshop

What can open source and free tools do for you? Surely they are unreliable? Where is the documentation? Where is the support? They are free but can’t do much for me, can they?

The number, scope and reliability of open source and free tools increase every day. In this session, we will explore the range of free tools available and how you can obtain, install and configure a few of them. The workshop aims to remove some of the mystique of open source, Linux-based and virtualisation tools. We’ll discuss how you need to adjust expectations and take a flexible, hands-on but realistic approach to exploring these new tools and opportunities.

Optional: Bring a laptop (with wireless or ideally a 5m network cable) if you want to download/try out the exemplar tools (to include Linux, OpenOffice, VMWare, Perl/Python/PHP, MySQL, Bugzilla, Watir, Selenium, Grinder).

Dave Evans and Mike Scott, SQS, Agile Test Management Workshop

In this workshop we will explore the most common issues and challenges faced by Test Managers working in an agile context.

Topics we’ll cover include test tooling strategy, test asset management, team size & configuration, management style, staff development & training, offshoring and distributed teams.  We will also take the opportunity to workshop as a group the issues of greatest importance to you.

George Wilson, Original Software, Taking Control of Your Test Data

Every organisation faces the challenge of delivering quality IT systems to support rapidly evolving business demands. The effective management of data is critical in meeting this challenge, especially when it comes to building a test environment for ensuring your applications do what they are required to do. This workshop will explore strategies and techniques that can be used to support QA teams with the creation, management and verification of test data.

Test Management Summit - Abstracts and Materials

The downloadable material (PowerPoint, PDF or Word documents) can be found at the bottom of this page. The links will not be available unless you are a logged-in, registered user. You can register for the site here.

George Wilson, Original Software: Agile Test Automation – Can it be Done?

Agile is a methodology that is seeing increasingly widespread adoption. Yet for the QA professional an Agile approach can cause discomfort. In the ideal world they would have a 'finished' product to verify against a finished specification. To be asked to validate a moving target against a changing backdrop is counterintuitive. It means that the use of technology and automation is much more difficult, and it requires a new approach to testing, in the same way that it does for the users and the developers. This session will explore how to define a test process to ensure application quality within an Agile environment.

Alan Richardson, Compendium Developments: Exploratory Testing Techniques

In this session we will discuss a range of topics on exploratory testing that are important to test managers, and provide answers to some critical questions to help you improve your team's exploratory testing.

  • When is it appropriate to use exploratory techniques and how do we provide feedback on the value of this test activity to stakeholders?
  • How can test managers track the exploratory testing being done and remain sure that their testers are doing the 'right' testing?
  • What do managers need to know to effectively mentor their testers in exploratory testing?
  • What techniques do exploratory testers use?
  • What tools do exploratory testers use?
  • How do you recruit good exploratory testers?
  • What are the secrets to exploratory testing that will make the biggest difference to your implementation of it?
  • What are the steps you need to take to improve the quality of the exploratory testing done on your team?

Rob Lambert, iMeta: Agile is a mindset, not a methodology

Critics of Agile suggest it is chaotic and unstructured, but this misses the point. Agile empowers teams to define their own structures and frameworks that are right for that team, at that moment in time. This is because agile is a mindset, not a methodology. It's about getting things done the best way, not a set way. It's about adopting a new way of thinking, placing the power back with the people that need it, the team.

Gordon McKeown, Facilita: Establishing an effective performance testing environment

Creating and managing environments for effective load/performance testing is challenging and often “ends in tears”. After a breezy overview of the issues, the session will be directed by the participants, and we will share views and exchange experiences from the coal face. Topics to be covered include:

  • Test v Live; extrapolating from small scale to large.
  • Managing test configurations: who with what?
  • Testing needs versus security & other regulations.
  • Test data: logistical, legal and ethical issues.

Ray Arell, Intel USA: Case Study - Moving to an Agile Environment

A while ago I went to my software staff and declared “Hey! We are going Agile!” Yep, I read an Agile project management book on a long flight to India, and like all good reactionary development managers I was sold! A few years later our adaptation of the Scrum framework has taken shape, but it was not without strain on our development, test, and other Q/A processes. This session focuses on a retrospective of what went right and, more importantly, what went wrong as we evolved to our new development/test process, and the effect it had on our team. This will include an introduction to the software validation strategies we developed and adapted for SCRUM; an overview of what makes up a flexible validation plan; how we defined iterative test development methods and execution processes; how to define a customer persona to help test teams understand customer expectations on quality in each sprint delivery; exploratory testing and its usage in the SCRUM development flow; as well as the development of key checklists and done criteria that can enable a team to find success in the fast-paced world of agile development. Perhaps it will convince you that the shift to Agile is the way to go, and hopefully give you just a little more info on what you may be in for.

Graham Thomas, Independent: Test Process Improvement – Answering the BIG questions!

We all need to improve the testing process, but very few people actually answer the BIG questions, such as:

  • Why? Is it just to save money, or do it quicker?
  • How? Do we follow an accepted method – TPI, TMMI? What change methodologies are there that we can use?
  • What? Is it just automating test execution? What about planning, preparation, measurement and metrics, etc.?
  • Where and When? So where in our organisations, large and small, do we do this, and when is the best time?
  • And Who? Is this just a testing team initiative? Do we need help? Who else is involved?

It is easy to ask the BIG questions but what we really want to know are the answers! This session will work through these questions to draw useful conclusions from the group’s collective experience.

George Wilson, Original Software: Maximising Your Test Automation Success

For at least 20 years, test automation has been heralded as one of the saviours of IT development. But manual testing is still a vital part of the software testing process – 80% of testing is still carried out manually due to the failure of automated testing processes. This session will explore what determines the short and long term success of test automation, what to automate and what to leave as manual and how you can jump to automation in a seamless and painless manner.

Alan Gordon, SQS: Open Source vs Commercial (Performance) Tools - pros and cons

An option to reduce your performance test tool costs to zero sounds attractive - so why isn't everyone using open source tools?  Alan Gordon from SQS will facilitate a session to discuss whether we could get more out of open source tools.  In which situations are they most useful?  What skills does your team require?  How will it affect my performance test estimation & strategy?  We will discuss our past experiences and the likelihood of increased open source usage in the future.

Paul Gerrard, Gerrard Consulting: Innovative Testing Practices

What are the emerging testing practices that have most promise? What have you tried and works for you? Importantly, what did not work? Current buzzwords include ‘testing in the cloud’, ‘testing in the crowd’, ‘lean test management’, ‘open source test automation’ and of course Agile. This session attempts to separate the hype from the reality and provide some pointers for the future.

Paul Godsafe, Independent: Non-Functional Testing

Why is this so often not done at all or done badly? Can we build a Test Manager's toolkit to help? What would it contain?

There are a range of possible reasons why effective non-functional testing is not more common than it is today. In this session we will have the opportunity to explore some of these reasons and decide whether we can realistically address them by building a toolkit for Test Managers.

The proposition is that it takes much more than a toolkit to implement a worthwhile solution for delivering non-functional testing. We can investigate this with a view to determining whether there is scope for improving the uptake in these types of testing and whether Test Managers have a realistic chance of making a difference in this challenging field.

Susan Windsor, WMHL Consulting: Selecting our Testers and Measuring their Performance

This session will look at how to identify the skills and competencies your testing team will require (driven by your testing approach) and then select staff to meet your needs.  This includes reviewing their potential for contribution against a framework of Skills and Competencies; techniques for skills assessment; some pitfalls to look out for; and identification of training needs.

Come to this session if you want to learn more about this topic and if you’ve got your own experiences, hints and tips to share.

Sam Clarke, nFocus and Giles Davies, Microsoft: “How can I tightly integrate my testing into the Application Development Lifecycle?”

Testing has a role to play throughout the entire lifecycle of a software development project. However, test is still often seen as a “necessary evil” getting in the way of progress. Even newer methodologies such as SCRUM struggle with how testing should be implemented. It could be argued that testing protects the reputation of the product, business and ultimately the company by reducing the risk of shipping unreliable or difficult-to-use software. This can cover many things, for example software defects, missed requirements, usability, performance, security and manageability.

Sam and Giles will demonstrate how Microsoft’s Visual Studio 2010 enables full integration of the software testing process, focussing on how facilities within the tool help testers deal with the problematic area of changes which could potentially damage existing tested functionality. There will then be a workshop session where we will look at the critical success factors of fully integrating testing into the Application Lifecycle Management process.

Please note that this talk and workshop is applicable to iterative fast moving agile style projects as well as most other development methodologies.

Paul Gerrard, Gerrard Consulting: Regression testing: what to automate and how

Regression testing continues to be the main focus of test automation. However, regression testing is still a bit of a black art.

How much regression testing is enough? What and how much could and should be automated? What is the measure of success for regression testing? What is the best approach to building, running and maintaining an effective regression test suite?

This session looks at what to automate and how.

Susan Windsor, WMHL Consulting: TMF Requirements Debate: “Testers need formal, written requirements to test”

Textbook approaches usually assume that we have perfect requirements, well-understood and agreed by our stakeholders.

But we know things are much more complicated. Requirements are never perfect, understood or agreed.

Should we, as testers, press (and wait) for perfect requirements, or should we go with what we have and just get on with it?

This debate considers the issues.

Giles Davies, Microsoft and Sam Clarke, nFocus: TMF Test Manager Debate: Are Test Managers Necessary?

It could be said that a test manager is the project manager for the testing activities. If so, why don’t project managers pick up this role? Agile approaches seem to not mention the test manager role at all – is it necessary?

This debate considers whether test management is a role for an individual or an activity that others might fulfil.

Should we be seeking to manage our way out of a role or better establish our position’s value?

Giles will present the developer’s angle and Sam will put the tester’s case.

Mike Scott, SQS: Test Driven Development

What is TDD? Is it a technique for developers, testers or both? What do I need to know about it? How do I get my team started? Should I believe the excuses the developers give for not doing this? What tools and skills do my team need?

Mike will facilitate a group discussion around these and other pertinent questions in this interactive session.

2011 Test Management Summit

The FIFTH TEST MANAGEMENT SUMMIT will take place on Tuesday 25th and Wednesday 26th January 2011.
FINAL Summit Programme Now Available



Full Summit Programme



The Summit is sponsored by nFocus and Microsoft, Facilita, Original Software and IBM and is hosted by Gerrard Consulting.


Day 1 - Tuesday 25 January 2011

Test Management Workshops

Venue: Ball's Brothers, Minster Pavement

£95 plus VAT

Day 2 - Wednesday 26 January 2011

The Test Management Summit

Venue: Institute of Directors at 116 Pall Mall, London

£100 plus VAT
Dinner: £40 plus VAT

To Book a Place

To book places at the Workshops, the Summit and dinner, download the booking form, complete it and fax or email it to us.

Booking form with new VAT Rate

Click "Read More..." for more information.

Day 1, 25 January: Test Management Workshops

We'll be presenting six Pre-Summit workshops on Tuesday 25 January at the Ball's Brothers Conference Centre at Minster Pavement as follows:

9.00am-10.00am Registration, Tea/Coffee

10.00am - 13.00pm Sessions A, B, C

A. Susan Windsor, Gerrard Consulting: How to deliver Test Assurance

Test Assurance is a whole set of skills and competencies above and beyond Test Management. It's a role that is a natural career progression from Test Management, but not many people can make the transition, can you? It requires working directly for senior stakeholders, being independent from the Test Activity but fully aware of its status, recommending rectification action to get the project back on track, and most importantly reporting back to your senior stakeholders. This session is designed for people who want to learn about what's involved, so they can either undertake such a role themselves in the future or identify how to introduce such a role into their organisation.

B. James Lyndsay, Workroom Productions: How to Diagnose Bugs

Good testers need to be able to go beyond simply logging a problem. To give value to their stakeholders and integrate with their development teams, testers need to be able to investigate the problems that they find. Diagnostic skills will help a tester to isolate genuine problems from a rash of symptoms, to work out what lies behind field reports, and to communicate her bugs effectively by describing plausible models. In this hands-on workshop, we will use a succession of practical exercises based on real problems including truncation, bottlenecks, boundaries and emergent behaviours. Participants will select test conditions to isolate and emphasise a bug, analyse data to reveal connections and populations, and work with logs and events to arrive at sequences that reveal potential cause and effect. At the end of the workshop, participants will have an improved understanding of the techniques and principles of diagnosis that can be applied to issues found in their own systems. Please note: we will test software - bring a laptop to get the most from this session.

C. Paul Murray, Rational QM Technical Lead, IBM: Introduction to Jazz and Rational Quality Manager

IBM is building a new generation of products to make software development more collaborative, productive and enjoyable.
Come to this session to get an introduction to the Jazz platform and learn how it is helping define industry leading Collaborative Lifecycle Management standards. We will introduce Jazz and its vision then dig deeper into how Quality Management is evolving on the platform.

LUNCH 13.00pm-13.45pm

D. Paul Gerrard, Gerrard Consulting: Using Stories to Example and Test Requirements and Systems

This workshop explores how stories can be used to provide examples of a system in use. Stories that are derived from written requirements can be used to walk-through business scenarios and when users 'see the proposed system in action', requirements anomalies stand out a mile and trigger discussion of scenarios, variations and outcomes. A disciplined approach to story-writing and requirements testing can improve requirements and the solution target dramatically. These stories can be used as examples for developers to see what was intended, but can also provide the basis for later acceptance tests.
Up-front requirements testing doesn't require extra effort - much of this work would be done during acceptance test preparation anyway. We'll use a simple case study and you'll get hands-on practice in writing stories, scenarios and examples. We’ll use the Testela Business tool to capture some business stories and show how these can be used to test requirements and extract test cases. You will see how the tool can then assist with business impact analysis, regression testing, and even test automation. You don't need a laptop to write stories, but you'll need one (with any OS/leading browser) to access a wireless network to play with Testela.

E. Dan Crone, nFocus: A Deeper Dive into the Test Functionality of Visual Studio 2010

This session dives deeper into understanding some of the test functionality within Visual Studio 2010. As we all know, the collaboration between developers and testers is vital; it can make the difference between shipping quality applications and systems, or release dates slipping because of show-stopping bugs found late on. In this interactive workshop, we will show you the full application lifecycle using Visual Studio 2010. This will include how developers and testers can benefit from integrated tools throughout the lifecycle. We'll show you how test case management tools can aid in creating and organising test cases, how we can easily replay application execution history, how actionable bugs can be created in the new Microsoft Test and Lab Manager, analysing code churn using the test impact collector and much, much more!

F. Fran O'Hara, Inspire Quality Services: Agile Test Management

Much of the focus with testing in an agile environment has been on practices/techniques and tools, but what about Test Management? Test Managers want to know what happens to their role when teams are self-empowered and what a test management process looks like in an agile context. Using Scrum as the main agile method (but with discussion of participants' own approaches/variations), this workshop will, through presentation, exercises and discussion, cover topics such as:

  • Transitioning the test manager role to agile… both from the project test manager and line manager perspective
  • Test strategy in agile in differing contexts
  • Test management process – estimation and planning in agile
  • Test management issues and their place in agile – metrics, process improvement, tooling, etc.

Participants, particularly those already working with agile, will be encouraged to share their challenges and experiences.

16.45pm Close

 

Day 2, 26 January: Test Management Summit

The Summit day will be structured into four parallel streams with four sessions in each stream. At the end of the day, we will all convene in the Nash Room to hear John Higgins' keynote talk. The programme of topics is listed below; the running order in each stream will be defined by the outcome of the Summit Topic Survey - below.

Timetable
9.00am Registration/Tea/Coffee
10.00am Sessions 1-4
11.15am Tea/Coffee in the Nash Room
11.45am Sessions 5-8
13.00pm Lunch in the Nash Room
14.00pm Sessions 9-12
15.15pm Tea/Coffee in the Nash Room
15.45pm Sessions 13-16
17.00pm Return to Nash Room
17.15pm Keynote Speaker: John Higgins
17.45pm Drinks Reception in the Waterloo Room
18.30pm Dinner in the Burton Room

The Summit Topic Survey will Determine the Running Order

One of the key advantages of the Summit is that we survey the delegates to find out which combinations of sessions are most popular and we then structure the programme running order to reduce conflict so that everyone has the best chance of seeing exactly the sessions they wish to.

Keynote - John Higgins, Director General, Intellect

We are delighted to announce that we have engaged John Higgins, Director General of Intellect, to present the Closing Keynote. His topic will be "Computing with Cuts, Carbon and Coalition": some of the challenges facing the high-tech industry and how we might deal with them.


Full Summit Programme

Launch of Mentoring Scheme

At lunchtime during the Summit, we will launch the mentoring scheme for Test Managers. We'll publish a proposal for how we suggest the scheme works soon.

Any questions?

Contact us here.

To Book a Place

To book places at the Workshops, the Summit and dinner, download the booking form, complete it and fax or email it to us.

Booking form with new VAT Rate

2012 Test Management Summit


The SIXTH TEST MANAGEMENT SUMMIT took place on Monday 6th and Tuesday 7th February 2012.

Designing for performance, Stevan Zivanovic, BJSS
Designing any testing is hard. The pressures imposed by projects are such that testing is expected to “just do it” and deliver. Is this fair? What can be done? How does this change when we deal with performance testing? This session will be highly interactive, exploring mechanisms for designing test approaches using the experiences of the participants.

Testing is a passion for Stevan. Over the 19 years that he has been actively involved, he has worked in a wide variety of businesses, from safety-critical aircraft systems to financial “start-ups”. He has a pragmatic approach to testing and a proven track record of implementing realistic, workable, real-world changes, from individuals to large, multinational teams. Stevan has presented at EuroSTAR, the UK Test Managers Forum, IQNITE, the Agile Business Conference and various companies. He enjoys speaking and training people to motivate them in testing and quality.

 

Does testing improve quality? Ben Fry, SQS

“Quality” means different things to different people. To a user its meaning is different to that of a developer or a tester. It’s clear that conversations between senior and exec management focus on quality, not Testing. So, what role does Testing play in an organisation’s attempt to achieve an appropriate level of “quality”? And, what role should Testing play? We will discuss the different perceptions of “quality”, the role Testing plays, the role Testing should play, and ways we can improve the organisation-wide understanding and appreciation of “quality” and Testing.

 

Designing for Regression Testing, Nikhil Nashikkar & Vishnu Chittan, ThinkSoft Global

As economic conditions bite, IT budgets are cut, and testing spend even more so. While change is inevitable in IT organisations, the focus of CIOs is to reduce the cost of regular, so-called “business as usual” activities. From the test team’s perspective, these changes need to be acknowledged when regression testing. This session will cover the following:

  • What components make a good cost effective regression test?

  • How do you measure success of a regression test?

  • Open discussion to the floor: How faster, smarter and better is the regression testing in your organisation?

     

Managing Remote Teams, Mike Bartley, T&V Solutions

“Susan, we need you to take over the team in X. Please make sure there is no drop in quality”

“Paul, the merger with Y means we have a team of software testers in Z to add to your existing team. That should solve your resource problems.”

“Jo, the CEO has decided we need to cut costs and outsource the software testing to X. Can you find a supplier and then manage the team please? Oh, and make sure the CEO gets his cost reduction!”

People are asked to manage remote teams for a variety of reasons and then expected to deliver to certain externally imposed targets.

In this session we will discuss the practicalities of managing a remote team and what needs to be put in place to ensure that the various objectives can be met. We will ask what specific challenges are raised when the remote team is performing software testing.

 

Developing team skills, Mike Jarred and Luke Avsejs, IDBS
This session will outline the journey taken to implement a competency-based Professional Development Framework for testing within IDBS. Most of the testers within IDBS have come from a scientific background so have huge domain knowledge; the PDF has been used to establish their level of skills and competency for testing, as well as showcasing their achievements to the rest of the organisation.
As well as sharing their implementation experience, Mike and Luke will host an interactive discussion on all aspects of skills development so the group can share knowledge and learn from each other.

Mike Jarred is the Director of Testing at IDBS, a market leading provider of innovative enterprise data management, analytics and modelling solutions which increase efficiency, reduce costs and improve the productivity of industrial R&D and clinical research. Mike has been in QA & Testing since 1990, implementing and developing testing teams in a variety of domains including Investment Banking, General Insurance, Retail and Private Medical Insurance.

Dr. Luke Avsejs is a Test Team Lead at IDBS with eight years of experience in IT, involved with the development of staff. Currently working with pharmaceutical data management software, Luke has a background in research chemistry and life sciences.



Crowdsourced Testing – Does it have Legs? Richard Neeve, Independent

Up until six months ago my interest in the burgeoning topic of crowdsourcing and specifically crowdsourced testing had been merely academic. But since then I’ve managed a client’s full-blooded embrace of crowdsourced testing, crowdsourced a name and logo for a startup and worked in a business that has crowdsourcing at its core. Previous TMF discussions on this topic have fallen a little flat and this one may too but I think the lack of concrete experience reports has been a problem.
Hopefully I can start to plug that gap by sharing what I’ve learned and trying to respond to the hopes and concerns people may have about using this approach in their own organisations.

Within a week of gaining a first-class degree in computer science in 1998, Richard was recruited as a contract tester by Antony Marcano, before rising through the ranks in a wide range of environments to become the BBC’s Head of Testing in just his ninth year of experience. In 2010/11 he worked with CxO-level stakeholders to successfully deliver a critical troubleshoot for a city firm in the credit derivatives space despite having no previous financial experience. He has maintained a 100% contract renewals record and has recently used commercial crowdsourcing to achieve objectives that have included, but also extended beyond, software testing.

 

Agile Techniques: Which ones really work in practice?, Paul Gerrard, Gerrard Consulting

After a brief brainstorm to set the agenda, this session will be run as a Goldfish Bowl. Come along and bring YOUR questions and experience to the group. Listen to people's experiences of what does and doesn't work.
We'll explain how a goldfish bowl works when you arrive, but put simply, it is a cross between musical chairs and a panel session where you take a seat on the panel to speak to the group. A civilised 'question time' that usually works very well at the TMF.

Paul Gerrard is a consultant, teacher, author, webmaster, programmer, tester, conference speaker, rowing coach and most recently a publisher. He has conducted consulting assignments in all aspects of software testing and quality assurance, specialising in test assurance. He has presented keynote talks and tutorials at testing conferences across Europe, the USA, Australia and South Africa, and has occasionally won awards for them. Educated at the University of Oxford and Imperial College London, he is a Principal of Gerrard Consulting Limited and is the host of the UK Test Management Forum.

 

Testing phases, are they still relevant? Derk-Jan de Grood, Valori

Lately there has been a lot of discussion about how testing will be organised in the near future. Some state that the test phase will cease to exist. This session discusses that statement. Testing will be required throughout the development lifecycle. It is expected that test activities will shift from an independent phase near the end of the lifecycle towards various activities throughout the lifecycle. These activities include reviews as well as end-to-end testing and even testing in production. In order to decide whether a separate test phase has value, we need to consider two components. Testing has a technical and an intelligence component. The first focuses on making the software work. The second focuses on governance and on providing information. We do not test because we can; we organise test activities because they matter. Each activity aims not only at finding bugs, but also at providing trust at an early stage of the development process. Is a lot of testing to be done by non-testers, e.g. the BA review of requirements, or testing by developers in their SCRUM sprint? Furthermore, James Whittaker states that users will be involved in testing more and more. Is the tester’s role then more of a coaching or controlling role? But this does not suit all situations. Let’s consider where an old-school testing phase might be the best solution. So we end up having more options, tools and measures to choose from. To do so, we require vision and an understanding of the goals and of the test profession.

Derk-Jan de Grood works for Valori. He is an experienced test manager and advises companies on how to set up their test projects. Derk-Jan has published several successful books. Currently he is co-authoring a book on the future of testing, which is expected in mid-2012.

Load Testing in the Cloud – Benefits and Challenges, Bruno Duval and Thomas Ripoche, Neotys

Many companies have moved applications to the cloud as a way to reduce capital expenditure while improving IT focus and effectiveness. End users see the cloud as a way to access their documents and applications remotely from anywhere and from any device. IT managers see the cloud as a means of rapidly adapting their infrastructures as needed via virtualization using a pay-as-you-go model. But what about load & performance testing engineers? Can they seize the opportunities afforded by the cloud to better test the performance of web applications?
As with past overhyped trends in IT, it is important to look beyond the talk to find concrete ways to take advantage of this new technology’s flexibility and scalability to save time, reduce costs, and improve the way your organization works.
This presentation describes how the cloud is revolutionising load testing and discusses the advantages it provides in many situations to ensure your web applications perform well in production. We will also look at the drawbacks of only using the Cloud and investigate the key capabilities to look for in a load testing solution. Without the right tools in place, simply moving your testing activities to the cloud will likely not deliver the results necessary to justify the move. Understanding how to apply the right tools and practices to make the most of the cloud is fundamental to cloud-based testing and vital to ultimately going live with total confidence.

Bruno Duval has a master’s degree in Network and Distributed Systems and has spent the last 10 years in the load and performance testing field. He has performed critical load tests in all industries and on all technologies. He is a real load test expert and has set up performance testing teams around the globe for several corporate accounts. In the last 5 years, he has led the Neotys Professional Services team, with consultants in Europe and America, and actively participated in creating the groundbreaking Neotys Cloud Platform. Recently Bruno has run numerous successful high-volume load tests around the globe using NeoLoad and the Neotys Cloud Platform, generating over 100,000 concurrent users and once again proving the strong expertise of Neotys on large projects.

Thomas Ripoche is VP of Sales EMEA & AsiaPac at Neotys. Throughout his career he has gained vast experience in dealing with highly critical web performance testing projects. One of his main responsibilities is to enable the implementation of modern load testing practices using NeoLoad. This new generation of load testing tool, suited to today's web applications, generates more value for organizations while increasing efficiency and shortening load testing cycles.


Latest Open Source Successes, Paul Rolfe & Vishnu Chittan, ThinkSoft Global

Does the Test tool’s capability limit the degree of completeness that can be delivered? While there could be a mixed response to this question, from a commercial standpoint one has to consider what open source tools are capable of delivering.

This session will explore some of the recent successes and lessons learnt in practically delivering Test automation & performance testing using open source tools. It will be an interactive session with practical examples of open source tool problems solved to deliver desired results. The audience will also be encouraged to share their experiences.

 

Test Estimation – a pain or painless experience? Lloyd Roden, Lloyd Roden Consultancy

Test estimation is one of the hardest activities to do well in testing; the main reason is that testing is not an independent activity and often has destabilising dependencies. During this session we shall uncover some of the common problems in test estimation and how to overcome them, together with seven ways we can estimate test effort. Learn how to estimate one of the vital missing ingredients when it comes to testing - estimation of quality. Learn how there is a direct correlation between the estimate of effort and the estimate of the quality that we supply to management.

Lloyd says … “With more than twenty-eight years in the software industry, I have worked as a Developer, Test Analyst and Test Manager for a variety of different organisations. From 1999 to 2011 I worked as a consultant/partner within Grove Consultants. In 2011 I set up Lloyd Roden Consultancy, an independent training and consultancy company specialising in software testing. My passion is to enthuse, excite and inspire people in the area of software testing and I have enjoyed opportunities to speak at various conferences throughout the world including STAREAST, STARWEST, EuroSTAR, AsiaSTAR, Belgium Testing Days and Better Software as well as Special Interest Groups in software testing in several countries.  I was privileged to be Programme Chair for both the tenth and eleventh EuroSTAR conferences and won the European Testing Excellence award in 2004.”

 

The Ten (false) Commandments of Testing, Morten Hougaard, Pretty Good Testing

‘The Ten (false) Commandments of Testing’ are:

  • You shall NOT start testing without requirements

  • You shall find EVERY bug

  • You shall use BEST PRACTICE

  • You shall be CERTIFIED

  • You shall AUTOMATE everything

  • You shall use THE Tool

  • You shall ENSURE (high) Quality

  • You shall use THE test technique

  • You shall use THE Model (Waterfall, V-Model, Scrum...)

  • You shall stay FULLY INDEPENDENT (from developers)

For many years, these (and more) false ‘commandments’ have quite heavily impacted the testing industry and done much harm. Worse still, these ‘wrong messages’ (at least to a certain degree) still seem to be ‘preached out there’.
Morten invites you to participate in this discussion on what kind of (false) commandments we (still) meet ‘out there’ and how we can use whatever fits our situation best, becoming pragmatic (and thinking) test professionals instead of merely ‘blind followers’.

 

Navigating the Sea of Siloes for Application Delivery, George Wilson, Original Software

A survey just over 18 months ago revealed discontent in the Application Quality Management (AQM) market, with 84% of users stating that their products did not meet their functional requirements. So why is it then that so many companies still rely on a plethora of management products to help with their application delivery? From defect management to bug recording, from requirements management to development reporting and from agile management to ALM dashboards, the fact is there doesn't seem to be a solution that is able to provide a holistic view and complete metrics when it comes to managing and reporting on all aspects of software delivery. This session will explore ways to provide a real time dashboard of ALM processes so that you can plot your exact position and navigate your voyage successfully.

 

Mobile Testing: a mandatory task or an option? Michael Hentze, Tricentis

In 2011, sales of smartphones increased by 74% compared with the previous year, and in 2012 this trend will continue. A wide range of technologies, operating systems and application types not only pose a technical challenge to software testing, but also require basic requirements to be considered. Test managers must make strategic decisions on whether they use simulators or real mobile devices for their tests; they must guarantee security and integrate the testing into existing ALM infrastructures. But: what must be tested, and is this testing mandatory or optional?

2014 Test Management Summit


This year, we will run the Summit at the Ball's Brothers, Minster Pavement conference centre (our normal Forum venue) on 29th and 30th April.

Summit Day 1 - 29 April - Workshops

Morning Sessions 10:00am

Graham Thomas and Phil Isles – Programming for Testers: It is easy!

Paul Gerrard, Gerrard Consulting, “Putting Models at the Heart of Testing”

Ilca Croufer, IDBS, Statistical Process Control workshop

Afternoon Sessions
14:00pm

Alan Richardson – Introduction to Selenium WebDriver - the talky bits

Mike Scott, Paul Wilford, SQS, “Where do Testers add value?”

Paul Gerrard, Gerrard Consulting, “An Introduction to Exploratory Testing”

Summit Day 2 - 30 April - Discussion Sessions

Twelve 75 minute facilitated discussion forum sessions in three streams.

10:00am

Tom Roden, Neuri and Ben Williams, Independent Consultant, "Test Management doesn't exist in the maturing world of agile, or rather Test Managers don't.......do they?!"

Chris Ambler, TCS John Huxley, “Put Everything On 13 Black And See What Happens - Testing In The Casino World”

Alan Richardson, Compendium Developments, "Automation Abstractions"

11:45am

Steve Watson, Reed Business Information, "Oh the joys of recruiting testers! Finding those elusive good quality candidates"

Joanna Newman, Ericsson, "Why Test Matters – simple ways to raise the profile of Test (in a good way!)"

Paul Gerrard, Gerrard Consulting, "A New Model for Testing"

14:00pm

Debbie Archer, ISQI UK, "Professional Certification – What bugs you?"

Matt Robson, Testing Solutions Group, "Everyone is doing it better than you"

Declan O’Riordan, TestingIT, “The What? Why? Who? and How? of Application Security Testing”

15:45pm

Raji Bhamidipati, NewVoiceMedia, "Is Software Testing still a good career choice?"

Rainer Grau, Zuehlke, "Waterfall, Iterative, Agile, Flow…a story of continuous improvement"

http://www.rainergrau.com/
http://www.ireb.org/

Mike Bartley, Test and Verification Solutions, “Test Strategies for Business Critical Software”

17:00pm

Pause

17:15

Closing Comments – Paul Gerrard

17:35

Drinks Reception

18.30pm

Dinner

21:30pm

Close

Download the Discussion Session Programme


2015 Test Management Summit

The Test Management Summit took place on 28-29 April 2015 at the conference centre at Ball's Brothers, Minster Pavement.

 

Here is the content for DAY TWO:

http://owncloud.sp.qa/owncloud/index.php/s/kT32w7KM6fwISpj

Day 1: 28 April - SIX HALF DAY WORKSHOPS

Download the full Day 1 Programme with session abstracts

09:00am

Registration, Tea/Coffee

 

Session A

Session B

Session C

10:00am

Graham Thomas, Independent
"Techniques for Successful Program Test Management"

Declan O’Riordan, TVS
"Security – What can testers do now?"

Paul Gerrard, Gerrard Consulting
"Putting Models at the Heart of Testing"

11:15am

Tea/Coffee

11:45am

Session A continued

Session B continued

Session C continued

13:00pm

Lunch

 

Session D

Session E

Session F

14:00pm

Graham Thomas and Phillip Isles
"Testing the Internet of Things: the Dark and the Light!"

Susan Windsor, Aqastra
"Rapid and Effective Team Building Techniques"

Niels Malotaux, Independent
"Improving the effectiveness of reviews and Inspections"

15:15pm

Tea/Coffee

15:45pm

Session D continued

Session E continued

Session F continued

17:00pm

Close

Day 2: 29 April - TWELVE 75 MINUTE DISCUSSION SESSIONS

Download the full Day 2 Programme, with session abstracts

08:30am

Registration, Tea/Coffee

09:30am

Paul Gerrard, Gerrard Consulting: "The New Model for Testing - One Year Later"

10:00am

Session A
Max Latey, Pinboard Productions: "Software vs Project Test Management - The Great Divide?"

Session B
Steve Dennis, Independent: "Don't waste your money on Certification!"

Session C
Viresh Doshi, UserReplay.com: "Continuous Test Automation"

11:15am

Tea/Coffee

11:45am

Session D
Susan Wilkes, Camelot Global: "QA Project challenges using multiple delivery methodologies across remote teams – not for the faint-hearted!"

Session E
Paul Gerrard, Gerrard Consulting:
"The Knowledge"

Session F
Gordon McKeown, TestPlant: “Who should cook and who should eat?” Tackling the burden of scripting

13:00pm

Lunch

14:00pm

Session G
Steve Watson, Reed Business Solutions: "Managing offshore testing"

Session H
Thorsten Heinze, Independent Games QA Consultant: "How to work on a project with changing scope targets"

Session I
Alan Richardson, Compendium Developments: "Successful Test Automation for Managers"

15:15pm

Tea/Coffee

15:45pm

Session J
Mike Jarred, Financial Conduct Authority: "The Difference between Programme Test Manager and Head of Test"

Session K
Graham Thomas and Phillip Isles: "Making and Testing Internet of Things devices – The Pain and the Gain"

Session L
Huw Price & Ben Johnson-Ward, Grid Tools: "Seeing is believing – 2 identical web sites, one with bugs, one without – let’s find the bugs"

17:00pm

Drinks Reception

18:30pm

Dinner

 

2016 Test Management Summit

The 2016 Annual Test Management Summit took place on Tuesday 26th April 2016 at the conference centre at Ball's Brothers, Minster Pavement.

Sponsored by Worksoft and Delphix

Timetable

09:00am

Registration, Tea/Coffee

09:45am

Welcome, Introductions

10:00am

Session A

Gary Hallam, Delphix, "The future of Test Data Management"

Session B

Mike Jarred and Paul Collis, FCA, "The Evolution of Test Governance and Assurance"

Session C

Edmund Pringle, David Parker, Metaswitch, "Developing Excellent Technical Testers"

11:15am

Tea/Coffee

11:45am

Session D

Jonathan Wright, Hitachi Consulting, "#DesignOps: DON’T PANIC! - The Hitchhikers Guide to Digital Engineering"

Session E

Steve Watson, Reed Business Information, "Diversification – is this the future for Test Managers?"

Steve's presentation is here

Session F

Kevin Harris, New Voice Media, "Achieving Zero Bugs on the Test Environment"

Kevin is presenting a webinar through EuroSTAR here

13:00pm

Lunch

14:00pm

Session G

Rod Armstrong, EasyJet, Asia Shahzad, Hotels.com, "Brave New World - Workshop"

Session H

Rob Vanstone XebiaLabs, "DevOps Overview - Lessons learned for 2016..."

Session I

Graham Thomas, Independent Consultant, "Instructional Games for Testers, Test Leads & Test Managers"

Graham's slides are here

15:15pm

Tea/Coffee

15:45pm

Session J

Mike Bartley, "Crowd Testing"

Session K

Isabel Evans, Independent, "Learning How to Tell Our Testing Stories"

Isabel's slides are here

Session L

Dean Robertson, Worksoft, "Make Regression a Thing of the Past"

17:00pm

Pause - Closing Talk - "Tales of Assurance", Special Guest, Susan Windsor, WMHL Consulting

17:30pm

Drinks Reception

18:30pm

Dinner

21:00pm

Close

 

Summit Session Abstracts

Kevin Harris, New Voice Media, "Achieving Zero Bugs on the Test Environment"

We all know that we should be looking to test earlier at every opportunity, and that bugs that are found later in the development process cost more to fix. This is a 75-minute Forum discussion to talk about how we could now be at the point where we should be able to prevent bugs from even getting to the Test Environment. Is it possible to have bug-free quality code on the Test Environment? If not, how far away are we, and what improvements can we look at to enable us to get there?

Possible steps to make this happen include: better communication about scope of testing at Kick-Off; striving for smaller stories; comprehensive test automation; Development & Test pairing; using Specification by Example (or alternatives); testing on Development machines (including remotely); reviewing and feeding back on automated tests in Development; running 5-Whys on all bugs found in Test; teaching your Developers how to test.

Do you feel your Test Team could get to this point? If so - then how soon? If not – then what’s blocking you?

 

Rod Armstrong, EasyJet, Asia Shahzad, Hotels.com, "Brave New World - Workshop"

In our interactive session Brave New World we will introduce a thinking tool that will allow you and your organisation not only to accept change, but to actively embrace it and follow a flow that will take that change through to a conclusion. We will illustrate the concepts with some examples of successes and also with some significant failures within organisations that did not successfully apply this kind of model.

We anticipate that this tool will be relevant to everyone, no matter what their specific level or role is within an organisation, and when applied it can lead to you looking at all situations, both professional and personal, with a new perspective.

 

Edmund Pringle, David Parker, Metaswitch, "Developing Excellent Technical Testers"

As a company we place a huge emphasis on helping people develop to their full potential. We have a great track record of helping new starters develop into solid testers, and we also do well at developing testers to become test managers or to move on to other roles in Development or Support. However, we have found it more challenging to help good testers who don’t want to move into management roles to become exceptional technical testers. Part of the issue is that we didn’t have a great articulation of what a really great technical tester looked like to help inspire people.

To combat this, we put together a roadmap of testing skills that helps people see where they are now and what they can push towards. We’ve found this useful, but it’s far from perfect. We’d like to open it up for discussion, to share our ideas with a wider audience and to get some feedback to help improve it.

 

Jonathon Wright, Hitachi Consulting, "#DesignOps: DON’T PANIC! - The Hitchhiker's Guide to Digital Engineering"

Join Jonathon Wright, Director of Digital Engineering at Hitachi, to explore digital delivery approaches, including DesignOps, that support modern-day scaled agile and lean practices. It is estimated that 50 billion devices will be connected by 2020, providing data and analytics to direct new business models, personalised experiences and social innovation. So ... Don’t panic! Jonathon will share recent industry case studies testing Enterprise of Things (EoT) platforms, including Smart Cities, Transportation and Energy.

Key Takeaways

  • Digital Disruption - accelerate the response to digital marketplace challenges
  • Digital Evolution - continuous assessment, insight, deployment and delivery 
  • AdaptiveUX - how to move from bi-modal (coreIT and fluidIT) to tri-modal digital delivery 
  • DesignOps - design practices (fail-fast experiments) through to operational feedback (measure/monitor)
  • DevOps - organization-wide cultural adoption of DevOps behaviours, processes and technology
  • Disciplined Agile - optimize the solution delivery life-cycle and deliver operational improvement

 

Graham Thomas, Independent Consultant, "Instructional Games for Testers, Test Leads & Test Managers"

The educational world has now recognised that we learn best when we are relaxed and having fun, and when we have the freedom to experiment and learn through making mistakes.

This session introduces the audience to some instructional games that will help them, as testers, test leads and test managers, better understand the power of communication, and improve their communication skills.

Ten Hundred Words

The first game, ‘Ten Hundred Words’, is based on #upgoerfive (https://xkcd.com/1133/): we describe testing, and test concepts, using only words on the list of the Ten Hundred most common words – by the way, ‘thousand’ is not on that list. This helps us to communicate effectively, whilst not confusing our audience with technical double-speak (or jargon).

Test Concept Charades

The second game, ‘Test Concept Charades’, is where we have to explain common testing concepts without using the key words. It sounds simple until you try it and find yourself explaining complex concepts in plain language.

Both games combine to help us learn new and enhanced ways to communicate more effectively with others, especially with those who may not have the same level of technical understanding that we do.

 

Isabel Evans, Independent, "Learning How to Tell Our Testing Stories"

When we interact with other people we tell stories – with our words, our pictures, and with our faces and our bodies… As Terry Pratchett noted, we would do better to call ourselves Pan narrans, ‘the storytelling ape’, rather than Homo sapiens, ‘the wise man’.

As IT practitioners on projects we can find that our messages are not heard or are misinterpreted. Sometimes we do not hear the messages other people need to tell us. Our natural abilities at storytelling and at appreciating the stories others tell us can be crushed under the weight of the ways we are expected to plan, report and communicate about the testing in IT projects.

In this workshop, I want to examine our ability to tell stories and how our natural delight in narrative can help us communicate about testing and quality to others.

We’ll look at why we tell stories, how we tell stories, and telling our stories in a way that is appealing to our audience. That means thinking about the role of oral, written and pictorial representations of stories, using the analogies of novels, serials, short stories, picture books, poems and songs. You’ll start to think about what story formats work best for your testing messages and how to adapt your testing stories to your audience.

We’ll also examine how we can better listen to other people’s stories about their parts of the project, about quality and about testing, and how we need to adapt our listening style to different storytellers when we are the audience.

I will tell stories and you will tell stories: we’ll listen to each other and learn to improve our listening and our narration.

 

Dean Robertson, Worksoft, "Make Regression a Thing of the Past"

Business and IT teams face intense pressure to deliver rapid change across complex enterprise landscapes. As a result, regression testing has become incredibly difficult to manage using manual and automated script-based approaches to QA. Fortunately, there’s a much better way to ensure every critical business process works as planned – even when applications change. Learn how high-velocity testing can help ensure unmatched quality through daily “lights-out” testing, shorten project timelines by more than 40%, and eliminate thousands of hours of manual labour, cutting test cycles by 90%+.

 

Mike Jarred and Paul Collis, FCA, "The Evolution of Test Governance and Assurance"

In 2015 Test Governance required a review as there were perceptions we weren’t as effective at testing as we needed to be, and the adequacy of our controls to ensure relevant testing was questioned.

Perceptions regarding effective testing were addressed (data won over opinion), but the Governance question persisted:

  • Programme risks affecting testing outcomes were not adequately managed
  • The existing governance model provided little confidence that relevant testing was happening, or that delivery outcomes would be successful
  • What did agile mean for Governance, and for how we interfaced with scrum teams? Was there a changing risk profile we needed to respond to?
  • Could governance improve collaboration with suppliers?
  • I wanted expertise in the Test Group to support our test managers in delivery; could Governance evolve into a valued mechanism for our stakeholders and testers?

Our model is evolving; discussing this experience and the questions below will inform that evolution and share thinking among the audience. Cake will be provided to celebrate the 50th TMF.

  • Without Governance, how do you know if you need it?
  • What are other people's experiences of operating Governance?
  • What are their terms of reference, and how do they measure success?
  • What are the inputs to Governance?
  • How important is cake, and what is the benefit of serving it at Governance meetings?
  • With test automation becoming pervasive, how is the quality of test automation assured against understood risks?
  • What is the role of governance in DevOps?

 

Mike Bartley, "Crowd Testing"

Start-up success is about gaining traction with new ideas, which iterate quickly and persistently. Testing in this environment is challenging: the task is to bring the right level of quality to curious new customers while ensuring new features and ideas make it to market quickly.

In this customer-led knowledge share we demonstrate how a crowd-based approach has been effective in balancing these challenges and providing a long-term, scalable solution for the start-up as it grows.

The session will also discuss how Crowd Testing can add value to the QA performed by larger organisations.

 

Rob Vanstone, XebiaLabs, "DevOps Overview - Lessons learned for 2016..."

As DevOps and continuous delivery slip further into the mainstream, the question for most enterprises is becoming less “What is DevOps?” and more “How do we get started?”

Hear from Rob Vanstone from XebiaLabs, as he looks back at lessons learned from implementing DevOps and Continuous Delivery initiatives in 2015 and what it means for 2016. Discover key takeaways that can help you make 2016 the year of DevOps in your organization.

The discussion will focus on

  • Skills needed for DevOps at Enterprise Scale
  • Balancing culture and tooling
  • Actionable advice from “Real World” implementations
  • Avoiding pitfalls that delay your transformation

 

Steve Watson, Reed Business Information, "Diversification – is this the future for Test Managers?"

I have been involved in testing for 28 years, and  a Test Manager for the last 8 of them. Testing is ingrained in me – cut me open like a stick of rock and you’ll probably see the word ‘Tester’ within! So the talk over the past few years about the future of Test Managers in an Agile world has been both interesting and alarming. Have I worked my way up to a role that will no longer exist in 5 years?

Who knows, but I was given the opportunity to try something different alongside my TM role last year, and I grabbed it with both hands, even though it took me outside my comfort zone. I want to share my experience of becoming responsible for delivering a brand new product to market, despite having no track record in this area: the lessons I have learned, new skills developed, old skills sharpened, and mistakes made.

We will have an open discussion about the skills we as Test Managers possess that could help us diversify into other roles, and how it can benefit our careers.

 

Gary Hallam, Delphix, "The future of Test Data Management"

All companies experience frustration in getting regular, up-to-date copies of good data into test environments. It can take days, weeks or even months to provision, and often all you are given are data subsets – a poor substitute. In the testing process it can take hours to reset the environment ready for re-testing, and attempting to obtain data from archives for point-in-time analysis can be nigh on impossible. Finally, you know that the data should really be protected using masking, but this adds further weeks to the process (if it is done at all) and is massively expensive and complex.

What if all of this could be done using self-service in minutes? What if every tester could have their own complete and current copy of the data? What if they could refresh, reset, rewind or share that data whenever they wanted, without ever picking up the phone to operations? And what if data masking was just a seamless part of data delivery?

 

2017 Assurance Leadership Summit

The 2017 (Tenth) Annual Assurance Leadership Summit took place on Tuesday 25th April 2017 at the conference centre at Balls Brothers, Minster Pavement.

Sponsored by Capacitas and Testhouse

Timetable

 

09:00

Registration, Tea / coffee

09:45

Opening remarks, welcome, introductions : Mike Jarred

10:00

Opening Keynote: Rob Lambert, "10 Behaviours Of Effective Employees"

10:45

Session A

Session B

Session C

 

Vishal Anand, Cognizant

Automation in an Automated World: The world is automating and automating fast

Vishal's presentation slides

Harry Vazanias, Enfuse Group

Governance is dead – long live governance

Harry's presentation slides

Ani Gopinath, Sug Sahadevan & Mohamed Radwan, Testhouse

Blueprinting your DevOps Challenges

12:00

LUNCH

13:00

Session D

Session E

Session F

 

Max Latey, HSBC

Case Study: Shift Left and Agile in Banking

Max's presentation slides

Isabelle Cosar & Cliff Zattoni, FCA

Continuous Evolution of Performance Test Assurance

Isabelle and Cliff's presentation slides

Ben Fry, Armakuni

Comic Relief: 500 real-time donations per second, serviced over 100,000 concurrent web sessions, production changes turned around in 15 mins on event night

14:15

Goldfish Bowl discussion - Assurance in the digital age (Paul Gerrard / Mike Jarred)

15:00

Coffee & Networking

15:30

Session G

Session H

Session I

 

Thomas Barns, Capacitas

Why you’re spending too much money on performance testing

Thomas's slides are here

Andrew Ayres, ROQ

Quality Assurance – A Strategic Imperative in the Digital Age

Isabel Evans, Independent

Leading, following or managing? You can help your group thrive

Isabel's Presentation material

17:00

Closing Keynote: Declan O’Riordan, "Will Risk Transference outgrow Risk Mitigation?"

Declan's Prezi is here

17:30

Drinks Reception

18:30

Dinner

21:00

Close

Abstracts

Opening Keynote: Rob Lambert, New Voice Media: "10 Behaviours Of Effective Employees"

The software industry is changing rapidly, and traditional approaches to hiring and organising teams are quickly becoming outdated. Attracting the right people requires a different mindset from managers in many businesses, and becoming an effective employee requires a focus on attitude and behaviours, not just skills. In this fast-paced and fun talk, Rob will walk through why having more than "skills" is necessary for success at work. Rob will explain how he deconstructed his best employees and focussed on hiring for behaviours rather than competencies. The 10 behaviours that will help employees remain relevant and managers hire the right people are:

  1. Being Visibly Passionate
  2. Being Aggressively Open-Minded
  3. Stepping Outside Their Job Role
  4. Becoming Company Smart
  5. Customer Focused
  6. Improving The Process
  7. Do What They Say They Will
  8. Communicate Effectively
  9. Self Learners
  10. Bravery

This is a fun and energetic talk and you’ll come away from it inspired to go forth and find effective employees.

Rob Lambert is Vice President of Engagement and Enablement at a fast-paced technology company called NewVoiceMedia. Rob is also the owner of Cultivated Management, an online resource for those new to the challenging world of management. Rob started his career as a Software Tester, became an Engineering Manager and has now moved sideways into HR. Rob’s mission is to inspire people to achieve great things in their careers and to take control of their own learning and self-development. Rob is the author of Remaining Relevant and Employable, a book about remaining relevant and employable in today’s changing employment world. He’s on Twitter at @rob_lambert. Rob is an advocate for many important social causes, is obsessed with technology in society and has written a number of books about software delivery, customer excellence and community building. He likes photography and classic Japanese cars. He is married with three kids and lives in historic Winchester, UK.

Vishal Anand, Cognizant Technology Solutions: "Automation in an Automated World"

The world is automating, and automating fast – be it self-driving cars or Alexa the assistant. AI and machine learning are the catalyst. Combine that with the persistent demand for faster rollouts driven by customer and business needs, and ensuring quality becomes a very challenging role.

This session will focus on the key challenges that these new trends bring to the QA function, especially how quality can be improved without impacting agility while also controlling cost.

The session will be an open discussion on challenges, solutions and the way forward, without being prescriptive.

Vishal Anand was a career tester before moving into management positions spanning setting up and managing teams, handling multiple accounts simultaneously and holding portfolios.

Harry Vazanias, Enfuse Group: "Governance is dead – long live governance!"

Long-standing ‘best practice’ governance approaches and frameworks are getting in the way of producing software for business purposes. In today’s fast-moving digital world governance is more important than ever, with an increased need for secure, always-on IT. However, long-standing governance models are not only failing to meet this need, but are making things worse. Frameworks such as ITIL, which have long been the model for most IT organisations, are no longer fit for purpose, yet we are clinging to them and trying to adapt them, when in truth a completely fresh approach is needed. This talk is about the holistic challenges software delivery is facing, with real-life examples, and will present various thoughts and approaches to addressing this need.

Harry Vazanias is an IT strategy and transformation expert. He is a prominent believer in the need for a new-age IT function built on digital and DevOps principles. He has over 15 years’ experience in consulting, where he has led and advised various business and IT transformations. He is co-founder and director at Enfuse Group, having previously worked at Accenture and North Highland, where he headed up their Business Technology consulting services in the UK.

Ani Gopinath, Sug Sahadevan & Mohamed Radwan, Testhouse: "Blueprinting your DevOps Challenges"

This will be a facilitated session looking at the challenges of adopting DevOps for Testers and Test Managers. The result will be a crafted problem statement that can be taken back to the participants businesses as a discussion point to help solve DevOps related issues. The session will be facilitated by Ani Gopinath (Head of Delivery), Sug Sahadevan (CEO) and Mohamed Radwan (Head of DevOps).

Sug is the founder of Testhouse. He started his working career as an Electronics Engineer with GCHQ, Cheltenham and gradually progressed into programme management while working for Digital Equipment Corporation, Bank of America, Centrica and Deutsche Bank. He founded Testhouse in 2000 and currently focusses on its global expansion. He serves as an advisory panel member for the Winmark Global CEO Network. Sug holds a Bachelor’s degree in Electronics Engineering from University of Ulster at Coleraine, Northern Ireland.

Mohamed Radwan is a Visual Studio ALM MVP and DevOps Practice Lead at Testhouse; he focuses on providing solutions for the various roles involved in software development and delivery, enabling them to build better software through Agile methodologies and the use of Microsoft Visual Studio ALM/DevOps tools and technologies. He excels in software practices and automation, with 15+ years of professional experience spanning the different stages and phases of the software development lifecycle. He has helped customers across the world, including in the UK, Denmark, USA, Egypt, Oman, KSA, Kuwait and Libya, amongst others.

Ani Gopinath is COO and Head of Delivery, with overall responsibility for UK operations and project delivery globally. He has been involved with Testhouse since its formation, starting as a technical consultant specialising in performance testing. An experienced test practitioner and trainer with over 20 years of experience, Ani has led Testhouse project engagements with IBM Global Services, Capgemini, CSC and LogicaCMG. Ani holds a bachelor’s degree in Electrical and Electronics Engineering.

Max Latey, HSBC: "Case Study: Shift Left and Agile in Banking"

The session will consist of two parts:

  1. A presentation of a case study: implementation of Shift Left and Agile in a Treasury Risk project.
  2. A round table discussion of the case study and strategies for introducing new concepts in testing to companies with established working practices.

The presentation will cover a recent attempt at implementing new working practices – including Shift Left – into a programme of business and technology change within Banking, an industry notorious for its aversion to change.

The round table discussion will give participants (and the speaker) an opportunity to discuss strategies for implementing new working practices or test approaches with both willing and unwilling users & stakeholders.

The discussion will also provide an opportunity to consider future trends in test management and assurance, and the role we can play as champions of change.

Max Latey is a career test manager with a long standing focus on the user experience of testing and assurance.

Isabelle Cosar & Cliff Zattoni, FCA: "Continuous Evolution of Performance Test Assurance"

Over the past 18 months, our model for assuring the quality of Performance Testing has evolved. We have moved to a partnership relationship with our Performance Test suppliers and have changed our selection process to address a capability issue we had identified in the people delivering our Performance Testing. We have also moved to a supportive assurance mechanism with our suppliers rather than operating a governance-based model.

We will explain how we currently operate this assurance model for Performance Testing in the Testing Department in collaboration with our Technical Operation partners.

Our organisation is moving to new delivery models where cycles to Production are shorter and feedback loops are quicker. We are wondering how to do less assurance, quicker assurance or better-targeted assurance in order to maintain quality. Is this possible?

As we move to a Continuous Delivery pipeline and DevOps, we are at a crossroads where we need to rethink how we assure that the Performance Testing we deliver is targeted, relevant, cost-effective and in line with our stakeholders’ risk appetite.

Have your organisations tackled the problem of maintaining the quality of Performance Testing deliverables when delivery models have changed to CDM and DevOps?

  • What challenges did you face?
  • How have you evolved?
  • Did you manage to maintain quality, or has it gone down?
  • And if you haven’t gone down that road yet, how do you see your assurance model changing?

Isabelle Cosar delivered Performance Testing at RBS after a few years’ experience in automated testing. After leaving RBS, she joined StarBase, a specialised Performance Testing consultancy, as a Senior Consultant, and then joined the FCA four years ago, where she is responsible for the assurance of Non-Functional Testing.

Cliff Zattoni has had a hand in testing in almost every role he has held, but primarily while running the Capacity function at Sainsbury’s for six years and now at the FCA for the last three years. During this time he has helped shape and improve those test delivery functions to work seamlessly with Operations (run) and to break the us/them blame culture that usually grows in that space.

Ben Fry, Armakuni, Comic Relief: "400 real-time donations per second, zero downtime, 100% success – and the ability to implement a change in production on event night in 15 minutes!"

 

Armakuni worked with Comic Relief to transform its legacy donations platform into a secure, highly scalable and resilient microservice platform, which functioned 100% successfully on the night. In moving from a legacy Java monolith to a cloud-native microservice architecture, Armakuni supported the transformation from water-scrum-fall to Test-Driven Development and Continuous Delivery, using Cloud Foundry as the PaaS. This significantly reduced the resources needed to build, deploy and run a fully tested platform.

 

The presentation will take us through this and other case studies, analysing technical and cultural obstacles and points of resistance. We’ll also discuss why barriers to Continuous Delivery are no longer accepted as too big to overcome.

Ben Fry - Armakuni (Bio to follow)

Goldfish Bowl discussion - Assurance in the digital age (Paul Gerrard / Mike Jarred)

  • What are attendees' key challenges in assuring software delivery in the Digital era?
  • What does assurance look like, and who does it?
  • What are the skills required, and where do these people come from: hire, redeploy, or grow your own?
  • What are the challenges with toolsets supporting software delivery in the digital age?

Thomas Barns, Capacitas: "Why you’re spending too much money on performance testing"

Everyone knows load testing is expensive – tests take a lot of effort to set up, a long time to run, and even longer to work out why you’re seeing issues. On top of that you need big costly environments and toolsets which eat through your budget in no time. We’ll look at a number of examples and discuss how we can reduce the cost of performance assurance through shifting left, designing smart tests and automating problem detection.

Thomas Barns is Risk Modelling and Performance Engineering service lead at Capacitas, responsible for service definition and ensuring consistent best practice across projects. Over the past 10 years he has worked on large projects providing capacity and performance expertise to clients and also owned the roadmap for developing Capacitas’ technical software solutions. During this time, he has seen a big shift in how software engineering is undertaken and viewed by the business, and has built on this to introduce more effective and efficient performance risk management processes. This has meant a focus shifting away from large scale system testing to a full lifecycle approach, alongside research and development in automated data analysis. Thomas is currently defining and governing Performance Engineering processes and standards for a multi-million-pound multi-vendor programme of work at a FTSE 100 company.

Andrew Ayres, Director, UK&I – ROQ: "Quality Assurance – A Strategic Imperative in the Digital Age"

Leading analyst firms such as Gartner, Forrester and others have recently outlined their strategic planning assumptions and CIO priorities for the year ahead. It’s clear that, as a proportion of discretionary spending, software investments are on the rise (on a global, cross-industry basis). This should come as no surprise – software is now fuelling many aspects of our personal and professional lives, and leading firms are eschewing heavy IT infrastructures in favour of flexible, cost-effective digital services that are ‘fit for the future’. But it doesn’t stop there – software is the driving force behind AI, AR, Blockchain and a wide range of other emerging trends.

In this session, Andrew will highlight why developing software is not enough to compete anymore – it has to work, exactly as intended, first time around – and, when it does, the positive effects are there for all to see. Software that works is (amongst other things) a catalyst for digital transformation, a driving force behind enterprise collaboration and mobility, a way to delight customers, and a source of internal efficiencies and heightened business performance.

Therefore, testing and quality assurance is a strategic imperative, and the extent to which testing professionals can positively impact business outcomes has greatly increased. In this context, testing and QA professionals are the final arbiters of delivered quality, the difference between success and failure, and – given the importance of high-functioning software – become key allies to CIOs and other senior leaders alike. The days when these activities were seen as a ‘post hoc, necessary evil’ should be well and truly behind us now (sadly not always the case) – it’s time for testing to evolve and inject some new-world thinking to spur real digital success.

Some of the key areas (not exhaustive) to be discussed are:

  • How do we ensure our testing and QA approaches are truly fit for the future – including the following CIO-level priorities relating to software quality:
      • Big Data, Business Intelligence and Analytics
      • Cloud Hosted Services and Solutions
      • Enterprise Mobility
      • Enterprise-scale Applications
      • Customer-facing Platforms
      • The Internet of Things (IoT)
      • Artificial Intelligence, Augmented Reality and Virtual Reality
      • Blockchain and Distributed Ledger Technologies
  • What are the barriers and challenges that must be overcome to make this a reality?
  • What can we do to drive greater levels of strategic alignment with business stakeholders and their objectives?
  • How do we report on the business impact of QA and testing to underpin the need to shift left and create more influence?

Andrew Ayres is Director – UK & Ireland at ROQ, having previously held roles with leading information technology consultancies in the UK, Europe and the Middle East, working closely with Boards of Directors, CFOs, CIOs and their teams to make digital transformation a reality.

Isabel Evans, Independent: "Leading, following or managing? You can help your group thrive"

We work in teams. Teams that have goals, that work together to solve problems, that sometimes squabble and make up. How people in the group behave depends on the styles of leadership, management and followership adopted in the group, and on each of our individual behaviours.

Do we have a mentoring, coaching, managing or leadership role towards others? Are we following or learning from others? Do we influence our colleagues and organisations in public or behind the scenes? Are these interactions built into a formal hierarchy in our group? Have we informally adopted an interaction role? Or have we even been forced into a particular interaction role?

In order that we work together as efficiently and effectively as possible, we need to understand the range of approaches or styles for leadership and management, what styles we feel most comfortable with, and how we react to both being leaders and being led.

It’s not just humans who work in groups. Other animals can tell us about how we interact with individuals, teams, and groups both as leaders and followers. Isabel discusses animal behaviour, predators, animal groups, parasites and epiphytes, and the richness of symbiotic partnership.

Regardless of the project model/software life cycle model you use, you’ll need to understand these interactions, and when to adopt a leadership, mentoring, coaching, following or learning attitude in order to help your group thrive.

Three key points to take away:

  • Gain an understanding of leadership styles and how you react to them
  • Learn how the team organization and leadership style affects its effectiveness
  • Understand when to use these approaches most effectively and efficiently.

Isabel Evans is an independent quality and testing consultant. Isabel has more than thirty years of IT experience in the financial, communications, and software sectors. Her work focuses on quality management, software testing and user experience (UX). She encourages IT teams and customers to work together, via flexible processes designed and tailored by the teams that use them. Isabel authored Achieving Software Quality Through Teamwork and chapters in Agile Testing: How to Succeed in an eXtreme Testing Environment; The Testing Practitioner; and Foundations of Software Testing. A popular speaker and story-teller at software conferences worldwide, Isabel is a Chartered IT Professional and Fellow of the British Computer Society, Programme Secretary of the BCS SIGiST, and has been a member of software industry improvement working groups for over 20 years.

Closing Keynote: Declan O’Riordan, Testing IT: "Will Risk Transference outgrow Risk Mitigation?"

As testers we have traditionally worked in the identification, monitoring and control part of risk mitigation. Financial matters such as cost-benefit analysis are rarely our concern. Questions are now being asked about the effectiveness of technology versus how technology spending is prioritised. Even as IT budgets drop in some sectors, overall spending on cyber-security continues to grow at between 5% and 7%. Surprisingly, the really fast-growing area is risk transference: in some sectors cyber-insurance spending has grown by 63% annually. What could be the reasons for budget holders shifting their risk management approach from mitigation to transference? What are the implications for testing and testers? The facts don’t care about your feelings, but is there anything we should be doing to defend the value of risk mitigation, or could we find a place in risk transference?

Declan O’Riordan started speaking at conferences in 2014. Within nine months he’d won the EuroSTAR prize for best conference paper and was voted the ‘do over’ session delegates would most like repeated. In 2015 Declan won the prize for best conference paper at the USA’s STAR East conference, and was on the EuroSTAR programme committee for Maastricht. In 2016 he was joint winner of the EuroSTAR prize for best conference paper, and in 2017 an organiser of UKSTAR. He now devotes energy to rolling out SecDevOps initiatives using continuous real-time test instrumentation.

The small print...

There will be no automatic right to a refund, although you may substitute the named attendee up to 24 hours prior to the date of the event. In exceptional circumstances, which must be explained by you, you may ask us to refund the booking fee. Any refund is at the sole discretion of Gerrard Consulting and will be subject to an administration charge.