The NIS Directive

Security reputation is a major competitive differentiator, so companies in the banking and financial services (FS) sectors have historically kept internal cyber-security breaches and external attacks close to their chests. The (currently draft) European Network and Information Security Directive (NISD) is set to change all that: it requires “operators of essential services”, including businesses in the energy, telecoms, banking, health, transport and financial services sectors, to “take appropriate security measures” to prevent such incidents and, when they do occur, to report them to designated Computer Security Incident Response Teams (CSIRTs) across Europe.

Once the European Parliament has approved the Directive, European Union countries will have 21 months to pass their own laws to implement it. Member States will need to establish a national Network and Information Security (NIS) strategy, introduce regulatory measures to ensure network security, appoint a competent authority to monitor compliance with NISD, and create a Computer Emergency Response Team (CERT) to handle incidents and risks. NISD also requires collaboration and information sharing: when an incident affects more than one EU state, the relevant CSIRTs will inform the other affected Member States and come together to create a solution.

So, just how will NISD affect the insurance industry? And how can we redesign our test strategies to avoid cyber-security disasters and ensure compliance with the new legislation?


The Impact

It’s difficult to judge the impact at this stage as the Directive is still in draft form, but it seems unlikely that reported cyber-security breaches will be made public knowledge, which is good news, as publicity around a breach would erode customer trust and loyalty. Firms that operate internationally may not find the requirements arduous at all, as they will already be subject to similar legal requirements in countries such as Singapore. The telecoms sector will also be less affected, as telcos are already subject to incident-reporting requirements under the EU Framework Directive.

For other insurance companies, however, these mandatory requirements are new territory: they will need to prepare in advance, ensuring they have a clear cyber-security prevention, detection, response and reporting strategy, and the additional resources in place to implement it.

The Directive is also likely to be a catalyst for growth in the cyber-insurance industry, with PwC predicting the global market could grow to $5bn in annual premiums by 2018 and at least $7.5bn by the end of the decade. As insurers create more stringent rules to determine which businesses are an insurable risk, organisations in other sectors will be pushed to improve their own cyber-security and test strategies in order to get insurance coverage. There is concern in other industries that cyber-security insurance policies are becoming too onerous, with tight restrictions and high pricing. This paves the way for a disruptive new entrant to swoop in and corner the market with a better offering, so insurers need to take action to demonstrate their value. The key is for insurance companies to lead by example: ensuring their own security and security-testing strategies are geared up for maximum threat detection and protection, and using new technology and the IoT to analyse the potential risks in other industries more effectively, enabling better pricing models.


The Security Testing Strategy

The Financial Services industry already allocates 37% of its total IT budget to QA and testing (World Quality Report 2015-16) and, with 88% identifying security as an essential priority (WQR), it seems likely that, in light of NISD, a higher proportion of this budget will now be directed to improving security test strategies. Although the Financial Services sector has cautiously increased cloud-based testing over the last two years, the heightened emphasis on security brought about by NISD is likely to raise further concerns about cloud privacy and security, giving rise to greater adoption of hybrid solutions. The vast majority of insurance companies already undertake penetration testing, but for most this happens only annually or quarterly, with only a very small number testing monthly. Insurers will need to step up their penetration testing to meet the new NISD requirements and avoid a security breach or attack. We are also likely to see the Financial Services sector among the first to adopt widespread entitlement testing, a new test approach for securing data throughout the application development lifecycle. Entitlement testing enables management to determine who is accessing data and for what reason, and to restrict the ability to access and manipulate data to only those who genuinely require it.
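The mechanics of an entitlement test can be sketched in a few lines. The policy table, role names and `can_access` helper below are hypothetical illustrations for this article, not any real insurer’s access model or tooling:

```python
# Minimal sketch of an entitlement test: assert that each role can reach
# only the data it genuinely requires. Roles, resources and the policy
# table are invented placeholders.

POLICY = {
    "claims_handler": {"claims": {"read", "update"}},
    "underwriter":    {"claims": {"read"}, "pricing": {"read", "update"}},
    "auditor":        {"claims": {"read"}, "pricing": {"read"}},
}

def can_access(role: str, resource: str, action: str) -> bool:
    """Return True only if the role is explicitly entitled to the action."""
    return action in POLICY.get(role, {}).get(resource, set())

def test_entitlements():
    # Positive cases: entitlements that must exist.
    assert can_access("underwriter", "pricing", "update")
    # Negative cases: anything not explicitly granted is denied by default.
    assert not can_access("claims_handler", "pricing", "read")
    assert not can_access("auditor", "claims", "update")
    assert not can_access("unknown_role", "claims", "read")

test_entitlements()
```

The deny-by-default shape matters: the negative assertions are what catch an over-broad entitlement slipping into the policy during development.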

So, from a testing perspective, NISD is likely to bring greater maturity to insurance companies more quickly, and may even put them at the forefront of pioneering new testing methods in a bid to win the war on cyber-security.

James Higgins AUTHOR:
James works in new business development at Sogeti, opening doors, facilitating introductions to key decision makers and building relationships with future prospects.

Posted in: Application Lifecycle Management, Cloud, Digital strategy, Innovation, Internet of Things, IT strategy, Quality Assurance, Research, Security, Software Development, Software testing, Test environment, Testing and innovation, Transformation, User Experience      


Triple Impact

The Internet of Things (IoT) is having a triple impact on the insurance industry: customers are beginning to incorporate smart devices into their daily lives; insurers are building IoT ecosystems to capture data that will transform their policies’ value proposition; and insurers are also seeking to deliver IoT-linked insurance services. Connected-car solutions are the most prevalent, with 60% of top European insurers having already created offerings in this area and the UK and Italy leading the field. With the IoT bringing Customer Experience (CX) to the fore, the World Insurance Report 2016 shows that insurers need to bring their thinking more in line with customer expectations: for example, only 16.3% of insurers think driverless cars will increase in popularity, whereas 23.1% of their customers are keen to explore this trend; and 48.9% of insurers think their customers will be happy to use wearables, while only 30.1% of customers agree. So what are the IoT technologies transforming the insurance industry? How will these innovations affect CX and B2C policies? And what will our test strategies need to look like to ensure quality and protect brand reputation?

A New Era of Innovation

IoT technologies will have the most impact on car, home, manufacturing and health insurance. Car insurers are already using geospatial applications such as Nationwide SmartRide and Progressive Snapshot to collate customer auto data showing the speed, distance, turning and braking patterns, and general driving behaviour of their customers. Sensors are set to be big business in our homes and in offices, warehouses and factories, detecting dangers such as air pollutants and natural disasters like earthquakes and alerting insurers in real time. Sensors are also starting to be built into toys and consumer electronics, enabling insurers to predict potential product flaws and breakdowns and to meet extended-warranty promises faster and more effectively. Wearables offer insurers real-time insights into areas such as employee health and disability-claim compliance. All of this data, of course, needs to be gathered, stored, processed, analysed and turned into actionable insights, and insurers will need to be aware of the skills gap, upskilling by hiring and partnering with experts in relevant and more mature industries such as automotive and telecoms.

The IoT Insurance Model

All of these changes will hugely impact the way insurers interact with their customers as they build a more personalised picture of each individual and we head towards a new era of connectivity: “the Internet of Me”. As the need for risk estimation decreases and customers are encouraged to take risk management into their own hands, modifying their behaviour accordingly, insurers will have to find a new way to calculate policies; in the future, we may see a completely new insurance model based around prevention rather than risk assessment.

IoT Test Strategy

The Internet of Things and the Internet of Me will require insurers to adopt a new culture of consolidation, convergence and collaboration, and an unprecedented focus on the end-to-end customer experience (CX). This in turn requires a complex and comprehensive quality assurance and test strategy with a shift-left approach. We are already seeing the financial impact of this: the Financial Services industry now allocates 37% of its total IT budget to QA and testing, an 11% increase from 2014.

In terms of test strategy, in his new book IoTMap – Testing in an IoT Environment, Sogeti’s Tom van de Ven explains that there are three fundamental principles for success. Firstly, testing the IoT is all about CX. Secondly, we need to move away from functional unit and regression testing towards non-functional quality attributes focused on interoperability, performance, security, recovery and usability. Thirdly, crowd testing and storytelling will become of paramount importance. Looking just at telematics for a moment, our test strategy will need to include cyber-security testing, software testing and validation, audio and speaker testing, Bluetooth testing and mobile application testing, with strict measures in place to protect data privacy and security in the test environment. A further consideration is that all of this needs to comply with strict regulations.

The biggest challenge when creating an IoT test strategy for insurers is that we are testing physical things connected to many other devices, comprised of component parts from a variety of vendors with different security measures, in an area that currently has no proper standardisation, for an industry that is highly regulated!

In light of the challenges of IoT testing the new insurance test strategy needs careful planning and Tom van de Ven’s new 5-step approach provides an excellent framework:

  1. Take a layered approach to testing considering first the “Thing’s” sensors, hardware and mechatronics, what it is connected to and how it communicates, then the application data platform layer and finally the interface with the end user.
  2. Determine how to combine the layers and test the IoT solution end-to-end.
  3. Map out the required IoT test environments in terms of extreme conditions and level of connectivity.
  4. Determine the right quality attributes with a focus on non-functional testing – for example with a particular device usability might be more important than functionality.
  5. Implement the building blocks that enable the first 4 steps such as crowd testing and storytelling, automation and cutting edge methods such as AI and entitlement testing.

In a recent interview with CIO online, Farmers Insurance CIO Ron Guerrier acknowledged that the only way to approach IoT in the industry is to work with dedicated partners who already have a deep understanding of the insurance industry’s regulatory requirements and who can offer technology that is 80% ready to go and 20% tailorable to the particular needs of the individual insurance company.

Redefining the Industry

All of the findings in these recent reports, surveys and interviews point to one thing: the insurance industry is preparing for a new era of unprecedented innovation, leading the field in the adoption of IoT technology and redefining policies to align with the changing behaviour of business and individual customers to create a customer experience that will reshape the industry forever.


Posted in: Automation Testing, Data structure, Digital, Digital strategy, Innovation, Internet of Things, IT strategy, mobile applications, Quality Assurance, Requirements, Research, Security, Smart, Software Development, Software testing, Technical Testing, test data management, Testing and innovation, Transformation, User Experience, Wearable technology      


So it was that time again: Paul Gerrard Consulting’s Test Management Summit 2013, the 7th annual TMS event. Being early February, it was hideously cold outside, but Pall Mall was a great location and the Institute of Directors provided a fitting venue.

I am a relative newcomer to testing, and although the first steps are the steepest, I have slowly been gathering and exploring ideas and methodologies, trying to get up to speed as fast as possible. Fortunately for me, Sogeti invests in its people and has helped and supported me throughout the journey so far; anything that makes the boat go faster shouldn’t be sniffed at! It is this support that has allowed me to attend events such as the Test Management Summit, and this two-day event provided a great opportunity for me to showcase Sogeti’s offerings while investigating a number of new tools, new vendors and new faces.

I attended on Wednesday 6th February, the second day of the Summit, which consisted of a busy line-up of 16 discussion groups, with multiple speakers and multiple rooms occupying most of the first floor. With 100 people expected to attend, I anticipated a fruitful and educational event.

At the event we primarily exhibited and spoke about Sogeti’s testing services, although the event seemed a little dominated by tool vendors. This wasn’t necessarily a bad thing; to me they all have opinions that matter, and clients who want to fully utilise some of the tools and ideas being touted need an organisation like Sogeti to really bring home the savings, efficiencies and gains that the vendors claim.

A particular area of interest for me was requirements gathering, and how important it is to introduce a testing perspective at this early stage in order to prevent costly and time-consuming bugs and delays later in the development lifecycle. This topic was highlighted in the very first session by Michal Janczur of Deutsche Bank, who spoke about the importance of testing early on. In fact, ‘Working to Improve Quality of Requirements’ was then the central theme of the keynote slot held by Paul Gerrard himself.

It is interesting that this subject, along with mobile test automation, dominated most of the discussions I had at the summit; these areas are not new, even to my untrained brain. Sogeti have been preaching these ideas for a while now, and we have structured and proven approaches to both. These industrialised methods enabled some good conversations around our offshore Mobile Test Centre of Excellence and our brand-new publication ‘Point Zero – Shift Left’, which presents our approach to introducing testing very early in the lifecycle, preventing costly defects further down the line and allowing time and energy to be better spent elsewhere.

Overall the event was good in terms of vendor awareness, and I found it really useful to learn about the new tools being launched onto the market. The summit also allowed me to identify, or more appropriately confirm, the major pain points organisations are feeling right now in their delivery of IT applications and services to the business, which will allow me to better focus my conversations.


Posted in: Events, mobile applications, mobile testing, Mobility, Opinion, Point Zero, Shift Left, Software testing      


It’s not uncommon these days to encounter businesses that have adopted a preferred supplier list, or PSL, particularly in the public sector. Larger organisations especially tend to use PSLs in their procurement departments to ensure that there is always a range of specialist vendors in place who have either previously worked with the business successfully or have already undergone a thorough assessment and review process.

There are many reasons why an organisation might choose to implement a preferred supplier list, usually related to streamlining procurement overheads, and this method of selecting service providers is certainly on the increase. But do PSLs sometimes hinder as well as help technological progress, especially for those organisations forced to use them too rigidly?

The number one reason cited for going down the PSL route is cost reduction. Procurement departments claim that by focusing on just a select few suppliers, upfront costs are greatly reduced. That is technically true: agreeing to use only one provider for a predetermined period would logically save time, money and effort in researching, vetting and recruiting new suppliers. A PSL might also create a feeling of stability and continuity in supply; those with a PSL appear to be in control. Unfortunately, though, the ways in which these ‘go-to’ lists are compiled often breed problems.

Let’s take testing as a good example. Imagine that an organisation is looking to select a partner to help deliver a number of projects, some more developed than others, over the next five years. Technology innovation waits for nobody, so you might reasonably expect that organisation to witness some major shake-ups and changes during that five-year period.

Is it really very wise for a modern enterprise to restrict itself to just one or two suppliers? Who decides who goes on the list? Will it be individuals in HR and procurement teams? Should those people then be removed from core business duties so that they have the space to properly assess and decide on what’s best? These individuals are, in general, too far removed from project delivery to understand or determine the selection criteria in isolation. It may be wise, therefore, to have representation from the affected parts of the organisation, such as the development or test department.

Irrespective of development and test involvement in the selection process, I believe that procurement departments place too much emphasis on the raw cost of any given service to truly see the overall value each supplier delivers. It is too difficult to consider and measure the value of specific suppliers in relation to a particular project by relying on a PSL alone. For this reason there should always be room for manoeuvre within any list.

If a company wants to keep pace with technological progress, then from time to time it will require greater agility, coupled with the ability to select the best supplier for the job at hand, not just the best on a PSL. Those with rigid lists need to seriously examine the true value they are getting out of them. After all, business is not always about saving a pound, but also about making a pound, and doing so efficiently. With the right suppliers available whenever you need them, without obstructive restrictions, it is far easier to achieve and deliver successful projects consistently.


Posted in: Opinion, Software testing      


Last week Sogeti hosted TestExpo Autumn 2011.  The event was held at the Queen Elizabeth Conference Centre and went off without a hitch. It was great to reconnect with so many of our industry partners, peers and customers. The event was, as TestExpo always is, collaborative, informative and engaging, focused on sharing ideas through keynotes, vendor presentations, exhibitors and the Innovation Panel. Over the next week or two we will be posting more from the event including video interviews, an overview and audio recording from the Innovation Panel, and the results of the software testing survey we conducted at the event.

Throughout the event, our Twitter feed was busy. (If you’re not already following @TestExpoUK, it’s worth doing so to catch follow-up announcements.) Specific concepts and ideas discussed at TestExpo included: who should get credit, the testers or whoever created the test; what innovation really is about; and the relationship between developers and testers.

TestExpo wouldn’t be what it is without those who support it. We want to thank all our peers: the sponsors, the exhibitors, Professional Tester magazine (the official media sponsor of the event) and, of course, all of this year’s attendees. We started TestExpo as a way for Sogeti to contribute to the testing community through learning. Not only have we expanded that learning experience, we’ve created one of the best networking events for software testing and Quality Assurance. It couldn’t be done without everyone’s support.

The videos, survey results and innovation panel recording will all be posted up here and on the Sogeti site over the next few weeks, so be sure to keep an eye open. Once again, thanks for a wonderful TestExpo Autumn!


Posted in: Test Expo      