As my colleague Jimmy Alevizos blogged earlier this month, our recent Global Testing Community Meeting in Mumbai brought together more than 100 test leaders from across the Capgemini and Sogeti global footprint.

Apart from the workshops and tours of the fascinating Testing Labs in Mumbai, the main focus of the event was to announce and congratulate the winners of our annual Testing Innovation Awards and to showcase the finalists’ inventive thinking.

Our Innovation Awards are designed to recognise and reward the creativity, ingenuity, expertise and contributions of testing individuals and teams within the Capgemini Group, which comprises Capgemini and Sogeti. They are the brainchild of Nijs Blokland, the head of our testing community. This year there was a fantastic response to the awards, with over 80 entries submitted from across the group – including submissions from Brazil to Australia, the US to the UK, and, of course, several from our hosts in India.

The innovations fall into two broad camps: the first set are more theoretical and conceptual, while the others reflect and recognise innovations developed for specific clients that can be leveraged in other contexts or client situations.

This year it was interesting to see how mobile applications are dominating many of our testers’ thinking. This reflects how the market is developing and how our clients’ priorities are increasingly focused on ‘being on the move’. But the shortlist also included submissions such as ‘Agile Test Traceability on the Cheap’ and ‘Testing without a Test Environment’, as well as bespoke tools developed for SAP testing, test case point estimation, and test data management (TDM).

The individual winner was from Belgium, with a Framework for HP QuickTest Professional Test Automation that promotes reuse, modularisation and scalability. It applies solid software engineering practices to the discipline of test automation, enabling test automation engineers to get maximum ROI from their work.

The Group Award went to an international team from the Netherlands and India that developed a dedicated Mobile App Test Center to overcome the challenges of testing mobile apps across different devices, while still using our standardised test process (TMap) and our new ‘Cover’ tool to automatically create test cases.

Parallel presentations from the finalists and runners-up gave us all a real taste of the talent that can be found in our testing teams around the world. This continued innovation is essential if the Capgemini Group is to retain its high-profile analyst rankings!


Posted in: A testers viewpoint, Capgemini Group, Opinion, Software testing, Software testing news, Sogeti Awards, Sogeti events, Testing and innovation, TMap


The Capgemini Group has topped Ovum’s 2011 benchmarking study of Outsourced Testing, for its “world-class testing service” – ranking above many other world-leading technology service providers. The combined Testing Practice of Capgemini and Sogeti was recognised by Ovum for its customer intimacy and responsiveness as well as its test process expertise.

The ‘Ovum Services Guide: Outsourced Testing’ benchmark study is based on 20 key criteria, ranging from cost and value to service portfolio, domain expertise, innovation and talent pool. Ovum benchmarked 13 software and systems testing services providers, ranking the Capgemini Group as the best. Ovum noted the Capgemini Group’s structured approach to testing through Sogeti’s Test Management Approach (TMap®) and Test Process Improvement (TPI®) methodologies, as well as its collaborative approach with customers at an operational level, agreeing Service Level Agreements and Key Performance Indicators to ensure combined decision making and shared accountability.

With more than 115,000 people in 40 countries, the Capgemini Group is one of the world’s foremost providers of consulting, technology and outsourcing services. The Group reported 2010 global revenues of EUR 8.7 billion. Sogeti, its wholly-owned subsidiary, is a leading provider of local professional services, bringing together more than 20,000 professionals in 15 countries, with a presence in over 100 locations in Europe, the US and India. Together, Capgemini and Sogeti have created one of the largest dedicated testing practices in the world, with over 8,200 test professionals and a further 12,500 application specialists, notably through a common centre of excellence with testing specialists based in India.

Be sure to read the full press release here.



Posted in: Capgemini Group, Software testing, Software testing news, Sogeti in the press, Testing and innovation


Tickets for the London 2012 Olympics went on resale recently, reigniting the hopes of many would-be spectators who missed out on the first-round ticketing lottery. With so much disappointment and criticism over the ticketing arrangements during the original sale, you’d think that the games’ organiser LOCOG (the London Organising Committee of the Olympic and Paralympic Games) would have ensured a bullet-proof system for the resale, but apparently not. Instead, the London 2012 ticket resale website, run by Ticketmaster, was suspended on the day of its launch.

Before the site was taken down, potential customers trying to use it to buy returned Olympic and Paralympic tickets discovered that the system was too slow to update ticket availability, resulting in baskets full of tickets being emptied at the checkout because they had been snapped up by other punters during deliberations. A statement from Ticketmaster was less than adequate for fans now feeling doubly deprived of their Olympic experience: “Ticketmaster has been working with LOCOG to identify and resolve the issues experienced last Friday, when the London 2012 resale platform was launched. Further updates will be provided as soon as possible. We want buying and selling Olympic and Paralympic tickets through Ticketmaster to be a good customer experience, so we will reopen the site once Ticketmaster has resolved those issues.”
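The failure mode described here – a basket assembled against stale availability data, then rejected at checkout – is a classic read-then-reserve race. The sketch below is purely illustrative (the `Inventory` class and its methods are invented for this example, not anything from Ticketmaster’s actual platform), but it shows why an atomic check-and-reserve step at checkout is the last line of defence when availability reads lag behind reality:

```python
import threading

class Inventory:
    """Hypothetical in-memory ticket stock for one event."""

    def __init__(self, available):
        self.available = available
        self.lock = threading.Lock()

    def snapshot(self):
        # An unsynchronised read: by checkout time this number
        # may already be stale, which is what fills baskets with
        # tickets that no longer exist.
        return self.available

    def reserve(self, qty):
        # Atomic check-and-decrement: exactly one buyer can win,
        # and the loser fails cleanly instead of overselling.
        with self.lock:
            if self.available >= qty:
                self.available -= qty
                return True
            return False

inventory = Inventory(available=1)

# Two buyers both see one ticket available (stale snapshots)...
seen_a, seen_b = inventory.snapshot(), inventory.snapshot()

# ...but only one reservation can succeed at checkout.
ok_a = inventory.reserve(1)
ok_b = inventory.reserve(1)
print(seen_a, seen_b, ok_a, ok_b)  # 1 1 True False
```

In a real multi-server ticketing system the lock would be replaced by a database transaction or a distributed reservation service, but the principle is the same: the availability shown while browsing is advisory, and only the reservation step is authoritative.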

The public might have shown more goodwill towards LOCOG and continued to trust in its ability to fairly distribute tickets from functioning websites if this had been an isolated issue. A previous Ticketmaster resale opportunity back in June 2011 was similarly chaotic, coming under criticism for underestimating volumes of traffic and crashing under load. So surely all transactional services would have been tested and retested by the organisers to eliminate any problems? Well, back in June, Michael Allen, Director of IT Service Management for IT delivery company Compuware, voiced his concerns to Computerworld: “The organisers no doubt ran tests to check the site could handle the traffic. It did slow down and that did frustrate a lot of visitors but it didn’t crash…”

But even with so much predictability and lead-time built into the Olympic calendar, how thorough did that testing really prove to be? LOCOG is certainly saying the right things, so how could a system that must have gone through an extensive software Performance Testing and quality assurance process have stumbled so badly at the final hurdle? A high-profile service such as the Olympic ticketing platform demands thorough end-to-end, discrete and load testing of all core systems and applications across every conceivable device, so what went wrong?

According to our Sogeti UK performance expert, Richard Badger, Performance Testing will always involve a compromise between cost and risk, as the cost of providing representative test environments, test data, test tools and so on is usually very high. A business-driven, risk-based methodology would normally be used for structured testing, identifying and addressing the key issues of quality, performance, scalability and cost across the whole lifecycle of solution delivery. An ongoing process improvement initiative would also help address key test process challenges, focusing on both functional and performance goals and building a higher level of test maturity – a more streamlined, efficient and holistic approach to testing that could be followed and developed throughout the duration of the games and for subsequent events.

In this case the assumption would have to be that the solution delivery processes did not enable the correct level of risk to be determined. There is a wide range of typical failure points. These include: a failure to identify or give sufficient weight to all of the risk areas; testing that gave a false impression the risk had been fully mitigated; testing based on inaccurate assumptions, so that the live situation was significantly different from what was tested; or even a system implemented with known risks whose likelihood of becoming issues was incorrectly perceived to be low.
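The risk-based weighting Richard describes can be made concrete with a simple likelihood-times-impact ranking. The areas and scores below are invented for illustration (they are not Sogeti’s actual TPI® scales or LOCOG’s real risk register), but the mechanics show how test effort gets directed at the riskiest parts of a system first:

```python
# Hypothetical risk-based test prioritisation:
# risk score = likelihood (1-5) x business impact (1-5),
# and test effort is allocated to the highest scores first.
areas = [
    ("checkout under peak load",      5, 5),
    ("availability refresh latency",  4, 5),
    ("seat-map rendering",            2, 2),
    ("email confirmation",            3, 1),
]

ranked = sorted(areas, key=lambda a: a[1] * a[2], reverse=True)
for name, likelihood, impact in ranked:
    print(f"{name}: risk score {likelihood * impact}")
```

The failure points listed above map onto this model directly: omitting an area from the list, scoring its likelihood too low, or testing it against unrepresentative assumptions all leave a high-risk item effectively untested.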

Slowly some tickets are now becoming available, mainly for Paralympic events and the football competition, but we’re getting closer every day to the planned 3 February cut-off date for this round of sales. The Olympic motto is ‘Citius, Altius, Fortius’, or ‘Swifter, Higher, Stronger’. At Sogeti, we now wish LOCOG a swifter resolution to its current problems, hope it will make robust testing a far higher priority in the future, and trust that all future services and applications it allows its partner Ticketmaster to manage will be exponentially stronger than the ones the British public have been made to suffer to date.


Posted in: Performance testing, Software testing, Software testing news