SOGETI UK BLOG

The Trouble with Twitchy Customers

Last month’s hacking of the Amazon-owned gaming company Twitch made big news, and what happened next was just as worrying. Twitch initially tightened security, in part by increasing the minimum length of customer passwords, then lowered it again in response to angry objections from customers who said the change made logging in too difficult.

One customer from Texas went so far as to post on Twitch’s Facebook page: “If users want to use a bad password, that’s their problem, not yours.”

This customer backlash against a business that was simply trying to protect its customers from security breaches raises an important question: who exactly is responsible for Cybersecurity? Is it the government, through the laws, policies and guidelines it creates? Should businesses in the private sector, which take and store our credit card and personal details, be held accountable for both internal breaches and external attacks? Or is it down to us, the consumers, to choose our passwords wisely and keep our information safe? The truth is that for a security policy to succeed, everyone involved at each stage of an online transaction has to take a share of the responsibility and work together towards the common goal of protecting society from malicious hackers.

UK Cybersecurity Essentials

In the UK, businesses that want to tender for government projects must adhere to the new baseline Cybersecurity standard set out in the Cyber Essentials scheme, whose five key controls are:

  • Secure configuration
  • Boundary firewalls and Internet gateways
  • Access control and administrative privilege management
  • Patch management
  • Malware protection

Although certification is not mandatory for private sector projects, hundreds of businesses such as Action for Children, Vodafone, SproutIT and ELEXON are getting certified to show they take Cybersecurity seriously. Here we see the public and private sectors working together and taking responsibility for ensuring both their own Cybersecurity and ours. As Cabinet Office Minister Francis Maude observed in a recent ComputerWeekly interview:

“While it’s right the government leads by example, we can’t do it alone. There’s no single magic bullet to neutralise the cyber threat, but the one thing common to all our efforts – whether it’s about resilience, or awareness, or capability and skills – is co-operation.”

The government is now using funds from the National Cyber Security Programme to create Gov.uk Verify, which enables purely digital proof of identity backed by a decentralised data storage system. It is being rolled out in the public sector, with the hope that the private sector will quickly cooperate and follow suit. Mind you, the PwC 2014 Information Security Breaches Survey also found that “70% of organisations keep their worst security incident under wraps. So what’s in the news is just the tip of the iceberg”, so the private sector still has some way to go before the majority of businesses can claim to be Cyber Secure and operating transparently.

Barack Obama Fights Back

As the Wall Street Journal reported a couple of weeks ago, U.S. regulators are deeming corporate boards ultimately responsible for successful cybersecurity strategies, and even suggesting that individual directors and security officers should be held personally accountable and liable in the event of a breach. In January of this year, after Islamic militants allegedly hacked the U.S. Central Command Twitter and YouTube accounts, President Obama defended proposed legislation creating a new level of corporate responsibility by saying:

“When these cyber-criminals start racking up charges on your card, it can destroy your credit rating. It can turn your life upside down. It may take you months to get your finances back in order…so this is a direct threat to the economic security of American families, and we’ve got to stop it.”

Stick to your Cybersecurity Guns!

This is all very well, but what do you do when your customers don’t want to play ball and your CSO’s job and your company’s reputation are at risk? Where your business is legally accountable for Cybersecurity and breaches put the business and individual board members at risk, it’s a good idea to dedicate a section of your website to informing your customers of the legal requirements and penalties and how the law is designed to protect them. A section explaining how your security strategy benefits them, and the risks associated with a breach, is also a good way of educating consumers. Similarly, responding to live feedback on social media with a brief explanation of the law, the risks and the repercussions can be helpful.

Ultimately, businesses need to stick to their guns and not bow to customer complaints about increased security measures. They have a duty to all of their other customers, and to the nation as a whole, to help stamp out attacks and breaches, and the only way to do this is for the public sector, private companies and individual consumers to collaborate and take joint responsibility. After all, the customers who are so vocal about the downsides of stronger security will also be the first to complain about your business if your security systems fail and cybercriminals hack into their bank accounts and start spending their money!

Barry Weston AUTHOR:
Barry is Sogeti's Solutions Director for Transformation, and winner of 'Testing Innovator of the Year’ at the 2013 European Software Testing Awards.

Posted in: A testers viewpoint, Behaviour Driven Development, Business Intelligence, communication, Developers, e-Commerce, privacy, Risk, Risk-based testing, Security, Technical Testing, User Experience      

 

“…For many online offerings which are presented or perceived as being ‘free’, personal information operates as a sort of indispensable currency used to pay for those services. As well as benefits, therefore, these growing markets pose specific risks to consumer welfare and to the rights to privacy and data protection.”

Source: Preliminary Opinion of the European Data Protection Supervisor, “Privacy and competitiveness in the age of big data: The interplay between data protection, competition law and consumer protection in the Digital Economy”, March 2014.

Loyalty & Trust

Trust is the key to achieving customer loyalty, and is itself attained through transparency, honesty and respect for the individual. In the wake of international scandals such as Wikileaks and Ed Snowden’s outing of the NSA’s practices, one of the most effective ways to gain your customers’ trust is to ensure their data privacy. Indeed, 89% of consumers surveyed in the TRUSTe 2014 U.S. Consumer Confidence Index stated that they would avoid doing business with a company they felt was not protecting their online privacy. Add to this the considerable legal and financial penalties incurred after a breach, the ensuing revenue loss and the resultant customer churn of around 4% (Ponemon Institute, 2013 Cost of Data Breach Study), and ensuring your customers’ data is protected could be a key differentiator that puts you ahead of the competition.

Legal Eagle Eye

In the UK, the Data Protection Act 1998 (DPA) defines the legal requirements for handling personal data. The 7th Principle states that “…appropriate measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data.” This raises the question: what actually constitutes personal data? According to the ICO, the statutory definition extends beyond automated records to “non-automated records that are structured in a way which allows ready access to information about individuals. As a broad rule, we consider that a relevant filing system exists where records relating to individuals (such as personnel records) are held in a sufficiently systematic, structured way as to allow ready access to specific information about those individuals.”

Until recently, the judgment in Durant v Financial Services Authority ([2003] EWCA Civ 1746; [2004] FSR 28; The Times, 2 Jan 2004) was the overriding precedent for more detailed guidance on what falls under the umbrella of “personal data”. In that case it was held that “personal data” was information of “biographical significance” that had to go beyond a mere mention of an individual’s name in a matter with no personal connotations, such as a meeting request e-mail. Durant has often been criticised for looking only at the content of the data and not at its context, thereby giving a narrow definition of “personal data” and yielding some unhelpful and unprotective results in the cases that followed it.

The Court of Appeal judgment in the recent case of Edem v The Information Commissioner ([2014] EWCA Civ 92) states that, to define personal data under the Act, we must look at the context of the data to see if a person can be identified. For example, if we saw the name Jeremy Hunt on a database, the name is fairly common and the person is probably not identifiable. However, if the other names on the list are George Osborne and Theresa May, or the database is named “Cabinet Ministers”, then we know what we are looking at and we know that this Jeremy Hunt is the Health Secretary. In other words, a name is always personal data if the context in which it appears is sufficient to identify the named individual. How does this affect your business? It sets the bar for data protection higher: when you sit down with your lawyers to define your data privacy policy and strategy, you need to consider the context and juxtaposition of the data, as well as the actual data content, when determining whether you have fulfilled your obligations under the Act.

In 2012, the European Commission (EC) published its draft proposal for a General Data Protection Regulation (GDPR), a new pan-European-Union standardised law to update and replace the Data Protection Directive of 1995, which was considered out of date in light of the growing number of US and European data breaches.

The Snowden debacle encouraged the EU to push the GDPR forward quickly, and it looked set to become law in May of this year. However, the Regulation has since come under attack from external sources and, perhaps more surprisingly, from within the senior ranks of the EC itself! The UK is lobbying to have the Regulation either downgraded to a Directive or abolished altogether, citing as reasons the need for each nation to determine privacy laws based on national priorities and the possibility that the proposed restrictions will inhibit business innovation.

The Janus Effect

A key element of the GDPR is the concept of the “one-stop shop”, whereby EU citizens would be allowed to file complaints with their own data protection regulator rather than being forced to do so where the company concerned is headquartered. In December 2013, in a volte-face that infuriated the EU Justice Commissioner, Viviane Reding, who introduced the GDPR, the head of the EC legal service (the somewhat aptly named Hubert Legal) voiced the opinion that the one-stop shop was potentially an infringement of the European Convention on Human Rights! Ms Reding maintained that the situation had not changed since the original decision to proceed was agreed, and that the GDPR should move forward without going back over old ground.

In light of these obstacles, some say the GDPR may not come to fruition at all; others that, at the very least, it will be years before it is enacted. So what does this mean for your business? Should you simply continue with your current data protection strategy, or should you begin updating your privacy policies now, in early preparation for the Regulation?

Serious Sanctions

Well, consider again the finding that data protection and privacy are crucial to your customers’ willingness to do business with you. Then consider that a 2008 report by researchers Aleecia M. McDonald and Lorrie Faith Cranor found that “Online privacy policies are so cumbersome and onerous that it would take the average person about 250 working hours to actually read the privacy policies of the websites they visit in a year”. Digitalisation and Big Data have only made data protection more complex and difficult to achieve in the six years since, and there have been few major amendments to the DPA, save the refinements made by the Privacy and Electronic Communications (EC Directive) Regulations 2003, which changed the consent requirement for electronic marketing to “positive consent”, such as an opt-in box rather than an opt-out. All this adds up to a situation in which many companies’ privacy policies are outdated, overly complex and failing to win the trust of their customers and prospects.

If the prospect of winning more customers isn’t a sufficient incentive to start preparing for the GDPR, then perhaps the heavy sanctions will be! The penalty for a breach of the GDPR is set to be a whopping €1 million, or up to 2% of your global turnover! Perhaps David Smith, Deputy Commissioner at the Information Commissioner’s Office (ICO), put it best in his speech at Infosecurity Europe 2014, when he said he expected the GDPR to be enacted in 2017 at the earliest but advised: “Get your house in order now, under the current law, to ensure you are ready for the coming changes, because the principles are not very different.”

10 Ways to Prepare for the GDPR

1. Do a full audit: take a complete inventory of all your data and create a map of data usage.

2. Ensure that your new strategy is designed around obtaining explicit consent for every use of personal data, throughout the data lifecycle (see the sketch after this list).

3. Create a solid data breach response plan, with clear processes and procedures for handling an unavoidable or accidental breach.

4. Ensure that data loss reporting is fast and thorough, as this will become mandatory.

5. Get the whole board involved and create a culture of privacy and protection, so that it is embedded in every part of your business and every member of staff, from the boardroom to the post room, understands its importance.

6. Appoint a Data Protection Officer, either part-time or full-time, depending on your business requirements, quantity of data, data usage and data testing.

7. Choose a framework that suits your business, such as ISO, NIST or COBIT.

8. Monitor your new system, utilise comprehensive reporting, and adjust the system accordingly so that it works for your business and your customers.

9. Rewrite your Privacy Policy so that it is user-friendly, and make it accessible on your website so that your customers can find it and easily understand it.

10. Ensure that you are using data obfuscation, and data encryption and decryption, at every stage in your test environments, in order to maintain integrity, privacy and data protection.
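Steps 1 and 2 lend themselves to a concrete illustration. The minimal sketch below, in Python, shows one possible shape for a data inventory entry that maps where personal data lives, what it is used for, and whether explicit opt-in consent is on record for each use. It is a sketch under our own assumptions: the `DataAsset` and `ConsentRecord` structures and their field names are invented for illustration, not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ConsentRecord:
    """Evidence of explicit, positive (opt-in) consent for one purpose."""
    purpose: str          # e.g. "email marketing"
    granted: bool         # True only if the customer actively opted in
    captured_on: datetime # when the consent was recorded

@dataclass
class DataAsset:
    """One entry in the data inventory / usage map (steps 1 and 2)."""
    description: str      # e.g. "customer email addresses"
    location: str         # system or database holding the data
    uses: list[str]       # every purpose the data is put to
    consents: list[ConsentRecord] = field(default_factory=list)

    def unconsented_uses(self) -> list[str]:
        """Flag any use with no matching opt-in consent on record."""
        agreed = {c.purpose for c in self.consents if c.granted}
        return [use for use in self.uses if use not in agreed]

# Illustrative usage: the marketing use below would be flagged for review.
asset = DataAsset(
    description="customer email addresses",
    location="CRM database",
    uses=["order confirmation", "email marketing"],
    consents=[ConsentRecord("order confirmation", True, datetime(2014, 5, 1))],
)
print(asset.unconsented_uses())  # ['email marketing']
```

Run against a full inventory, a check like this gives you a first-pass answer to the key question below for every use of personal data you hold.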

Here’s the key question to ask yourself at each stage: “Is it reasonable to assume that a member of the public would expect their data to be used in this way?”

Managing Privacy in a Test Environment

“Businesses should not rush products and services to market without thorough testing, and they should listen to their privacy advisors before giving in to pressure from the marketing department.” So says David Smith of the ICO, and he is right: a failure to test a fix in a test environment, for example, could introduce errors into the live environment, which could themselves result in a breach of the DPA. However, testing itself creates a variety of scenarios in which a breach of data privacy is possible; so how should you manage your data in a test environment?

It’s essential to ensure that you extract only the data required for testing, and then employ a variety of data obfuscation techniques, such as data substitution, number variance, gibberish generation, data masking and synthetic data, in conjunction with encryption. This keeps the data realistic and testable but hides sensitive details from internal staff such as application developers and testers. If obfuscated data is lost, it could be read by a non-authorised user, but they would not be able to ascertain the details of any individual, so a breach would be avoided. Your chosen data obfuscation strategy needs to be carefully evaluated to make sure that the obfuscated data is still suitable for testing, to establish how impenetrable the scrambled data is under attack, and to determine how much the strategy will cost. For example, if you’re testing an application that requires data validation, data substitution may be a simpler, faster and more cost-effective means of obfuscation than creating synthetic data.
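To make a couple of those techniques concrete, here is a minimal Python sketch of data substitution, number variance and masking applied to a single extracted record. The field names, the substitute-name pool and the variance range are all invented for illustration; a production approach would also need encryption, consistent substitution across related tables, and an assessment of how easily the result could be reversed.

```python
import random

# A hypothetical source record, as might be extracted for a test environment.
customer = {"name": "Jane Smith", "salary": 41250, "postcode": "SW1A 1AA"}

SUBSTITUTE_NAMES = ["Alex Taylor", "Sam Jones", "Chris Evans"]  # fictitious pool

def substitute_name(real_name: str) -> str:
    """Data substitution: swap the real name for a realistic fake one."""
    return random.choice(SUBSTITUTE_NAMES)

def vary_number(value: float, pct: float = 0.1) -> float:
    """Number variance: shift a numeric value by up to +/- pct so it stays
    plausible for testing but no longer matches the real figure."""
    return round(value * random.uniform(1 - pct, 1 + pct), 2)

def mask_postcode(postcode: str) -> str:
    """Masking: keep only the outward code so area-level logic stays testable."""
    return postcode.split()[0] + " ***"

obfuscated = {
    "name": substitute_name(customer["name"]),
    "salary": vary_number(customer["salary"]),
    "postcode": mask_postcode(customer["postcode"]),
}
print(obfuscated)  # e.g. {'name': 'Sam Jones', 'salary': 43876.5, 'postcode': 'SW1A ***'}
```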

Born Ready

So the truth is that many existing privacy policies aren’t truly in line with the current law, let alone prepared for the GDPR or designed to promote customer loyalty. Businesses that are not already updating their privacy strategy and making their policies more customer-friendly are missing an opportunity to differentiate themselves in a way that customers currently find very attractive. Similarly, investors, venture capitalists and angels are more inclined to invest in a business with an outstanding privacy policy, as it significantly reduces risk. Another major advantage of revising your Privacy Policy now is that you can spread the work and the cost of compliance over the next three years, before the GDPR comes into play. The bottom line is that the potential sanctions for non-compliance with the Regulation are so severe, and the benefits of improving your privacy policy so great, that whether or not the Regulation comes into force, it’s a worthwhile undertaking.

Peace of Mind

The importance of testing increases in parallel with the ever-rising expectations of your customers. In light of the complexities of data protection and the potential changes to the law, we’ve seen that it’s essential that your test environments are secure. Outsourcing your data testing to a business whose core competency is testing is a sensible way to ensure speedy, efficient and secure testing, with the right level of encryption and obfuscation to give you total peace of mind. Sogeti offers a complete end-to-end Test Data Management (TDM) Service that:

– Analyses organisations’ current software testing and test data management.

– Proposes what actions and toolsets are needed to improve testing.

– Helps customers choose the right testing tools.

– Offers a pilot or proof of concept to show that the selected tools can deliver the test data required and that the proposed process can deliver the expected benefits.

– Provides a full TDM rollout.

– Supports and trains customers all the way through the process and even after the rollout.

– Ensures that the number and size of the test environments are precisely what is required, by introducing a smart solution that makes the right data available for testing.

With our forward-thinking, comprehensive TDM service, we can help you ensure you are delivering quality and value to your customers while conforming to the existing and impending legislation.

Barry Weston AUTHOR:
Barry is Sogeti's Solutions Director for Transformation, and winner of 'Testing Innovator of the Year’ at the 2013 European Software Testing Awards.

Posted in: Big data, End to end testing, IT strategy, Open Data, privacy, Quality Assurance, Requirements, Software testing, test data management, Test environment, Test Environment Management      

 

Organizations start test process improvement programmes for a number of reasons:

* Curiosity about their current level of testing maturity
* A desire to reach a target level of maturity
* A need to fix actual or perceived shortfalls in testing capability.

The activity is often initiated by the manager of the testing function with a view to “dipping their toe in the water” and then deciding how to proceed with any improvement initiatives or programmes once the outcome of the initial maturity assessment is known. This can be done for a modest financial investment. If understanding maturity is the desired outcome of the assessment, then mission accomplished. If, however, as is usually the case, the purpose is to kick off a process improvement programme, then further investment will be required in terms of people and funding.

I said earlier that many assessments are initiated by testing managers. Where this is the case, there are often difficulties in implementing any required change, as the improvements needed frequently lie outside the core testing purview, requiring changes in other areas of the overall development process. Sogeti’s PointZero approach recognises this difficulty and identifies that senior-level sponsorship (outside of testing) is required to drive change within the organization. It is therefore probably best to think of test process initiatives as business process change focused on testing, and to build a guiding coalition of staff at senior and grass-roots levels to implement change and quickly address any difficulties that arise along the way. John Kotter, in his book “Our Iceberg is Melting”, identifies that such a guiding coalition, together with a sense of urgency around the change programme, is essential to any successful change effort.

What this really means is that any organization looking to embark on a significant testing change programme, rather than tactical incremental improvement, will need buy-in at all levels and across many disciplines within the business, ranging from procurement and development to business analysis and project management. Where this buy-in hasn’t been achieved, the speed of change is slower, the failure rate of improvements is far higher, and the financial investment to support the programme is often missing or harder to secure. Sogeti’s experience in delivering assessments and managing the subsequent improvement programmes can mitigate these problems through the identification of a real Return on Investment for implementing change, and by managing the change process through PointZero to establish senior-level sponsorship across the business. Sogeti uses its TPI Next® assessment model to assess test process maturity and to identify and prioritize the most important areas to address.

Barry Weston AUTHOR:
Barry is Sogeti's Solutions Director for Transformation, and winner of 'Testing Innovator of the Year’ at the 2013 European Software Testing Awards.

Posted in: A testers viewpoint, Opinion, Publications, Software testing, test framework, Test Methodologies, Testing and innovation, Transformation, Transitioning      

 

As Solutions Director for Transformation here at Sogeti, I was tasked this week with delivering a webinar on the hot topic of ‘Discovering a more effective route to Test Transformation’ – looking at going beyond the traditional time and materials and offshore models to really make a difference to the business as a whole.

I’ve been with Sogeti for around six years now, during which time I’ve focussed on delivering strategic projects and leading the process improvement practice; I’m currently working with a large mobile telco on their transformation project.

We have found that there are many synergies between process maturity improvement and test transformation; both take you on a journey from where you are (whatever your level of process maturity) to a changed state in which processes, delivery and supporting functions move into an innovative space, with significant reductions in the cost of test delivery, defect leakage and time taken to deliver.

The webinar focusses on how you too can achieve the real-world benefits we’ve seen where we’re already using Test Transformation with clients, including reductions in test costs of 30-40%, reductions in delivery time of up to 15%, and defect leakage of less than 2% (significantly better than industry estimates, which can be as high as 50%). These benefits apply not only to the initial solution, but also further down the line as the processes are maintained.
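For readers unfamiliar with the metric, defect leakage is commonly expressed as the share of all defects that escape testing and are only discovered after release. Here is a minimal sketch of that calculation, using illustrative figures rather than client data:

```python
def defect_leakage_pct(found_in_test: int, found_in_production: int) -> float:
    """Percentage of all defects that escaped testing into production."""
    total = found_in_test + found_in_production
    return 0.0 if total == 0 else 100.0 * found_in_production / total

# Illustrative figures only: 490 defects caught in test, 10 found live.
print(f"{defect_leakage_pct(490, 10):.1f}%")  # 2.0%
```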

In the webinar I discuss the roadmap to test transformation: from benchmarking the current function, through moving to a dedicated in-house test capability with standardised processes and ensuring skilled resources are in the right place, to automation, and finally the (maybe most important) stage of driving defect discovery as close to the start of the project as possible.

Transformation encourages businesses to move from defect detection to prevention – supporting collaboration amongst different groups early on in the development lifecycle, and looking to the root cause of problems rather than just fixing the top level. This leads to continuous improvement, and through Test Transformation we eventually drive defect detection right to the beginning of the delivery process – a little something we call our PointZero Vision.

The webinar notes the key areas of focus, and gives a real-world example of the results achieved. Some of these include:
– A year-on-year reduction in testing costs, underwritten by Sogeti, so that your risk of programme failure is mitigated.
– Better test processes – without them it is unlikely that any improvement in efficiency or reduction in cost would be achieved.
– Making your own organisation better at testing, and helping you to achieve cheaper test delivery with the same quality assured.

The last key point to mention is that Transformation is a change programme – it isn’t just a case of evolving the systems; it needs backing and engagement from key stakeholders.

The webinar is available to view here.  While I am very happy to respond to questions, please do get in touch if you would prefer a more in-depth discussion.

Barry Weston AUTHOR:
Barry is Sogeti's Solutions Director for Transformation, and winner of 'Testing Innovator of the Year’ at the 2013 European Software Testing Awards.

Posted in: End to end testing, Point Zero, Requirements, Risk-based testing, SDLC, Software testing, Test Automation, Test Methodologies, Testing and innovation, Transformation, Webinars      

 

As a leading provider of IT testing and Quality Assurance services, Sogeti is extremely familiar with the term ‘debugging’. Likewise, the commercial corporations and public sector organisations that understand the value of long-term strategic testing programmes, and that trust us to deliver the very best available solutions, frequently talk about ‘bugs’ when discussing their testing requirements.

But what do we really mean when we use the terms ‘bug’ and ‘debugging’, and who was responsible for making them part of every modern software tester’s vocabulary?

A ‘bug’ is a common way of referring to a defect in either a software or hardware system. Debugging refers to the process of finding those bugs and eliminating them, to ensure the system behaves only as intended. Complex modern software with numerous subsystems and integrated applications can present some very tricky debugging challenges, and in many cases the act of debugging can itself introduce new bugs in other areas of the wider system. Fortunately, as a specialist in the field, this is exactly the breed of challenge we relish at Sogeti.

So when did the earliest test specialists start using the term ‘bug’? Well, there are arguments that the term ‘bug’, referring to a technical error, dates back to Thomas Edison in the 1870s. The Oxford English Dictionary entry for ‘debug’ also suggests that the word predates computing, having been used in the Journal of the Royal Aeronautical Society in 1945. But by far the most compelling and attractive anecdote about the origins of the modern computer ‘bug’ attributes the terminology to Rear Admiral Grace Hopper.

Born in 1906, Grace was a US Navy officer and pioneering computer scientist who became one of the first programmers of the Harvard Mark I computer and an early pioneer of language compilers. Her ideas about machine-independent programming languages led directly to the development of COBOL, and towards the end of her wartime service she also happened upon the circumstances that have led many to credit her with coining the terms ‘bug’ and ‘debugging’. While working on the Harvard Mark II computer, Grace and her colleagues found a moth lodged inside the machine, preventing the operation of a relay and stopping the computer from working as intended. Crucially, she recorded the event, taped the actual moth into her meticulous logbook, and noted that her team had been “debugging” the system.

Whenever the first use of these terms really occurred, it is Grace Hopper’s story that, like the moth she discovered, has stuck, and it will always be cited by modern software testers. She never claimed to have invented the terms, but she certainly popularised them. Computer programmers adopted the ‘bug’ in the 1950s, and by the 1960s ‘debugging’ was common programming-lab terminology. The rest, as they say, is history.

 

Barry Weston AUTHOR:
Barry is Sogeti's Solutions Director for Transformation, and winner of 'Testing Innovator of the Year’ at the 2013 European Software Testing Awards.

Posted in: Software testing      