SOGETI UK BLOG

In 2012, there were 1,502 documented incidents resulting in loss of personally identifiable information, almost a 40% increase over the previous year’s count of 1,088. In the last three years, 21 million patients in the United States have had their medical records exposed in data breaches.

Data leaks are becoming a common occurrence, exposing personal details such as email addresses, passwords (both encrypted and clear text), and even national ID numbers. The IBM X-Force 2012 Annual Trend and Risk Report calls for tighter security controls and policies in the healthcare industry.

With healthcare providers and payers alike trying to retain customers through better quality care while complying with an ever-increasing corpus of regulations, patient records are the currency in trade, and as such must be protected with all due care. Moving to electronic health records (EHR) is a must for organisations that need to share data between providers—even competing facilities—insurance companies, and the patients themselves. Creating a record once and enriching it over the lifespan of the patient, by all caregivers and payers involved, holds the promise of reducing costs and improving outcomes. In addition, the U.S. government provides financial incentives for the meaningful use of EHR through the American Recovery and Reinvestment Act’s (ARRA) HITECH provisions.

Yet converting records to electronic format makes them convenient to steal en masse if they are not properly protected. The outcomes of EHR theft include brand reputation damage in a competitive market and financial penalties for non-compliance.

Here are some of the fundamental security controls healthcare organisations must undertake in order to safeguard patient data (a minimal encryption sketch follows the list):

  • Discover all EHR and personally identifiable information (PII)
  • Encrypt or mask EHR and PII at rest and in transit
  • Impose and manage role-based access to EHR and PII, coupled with central and/or federated authentication
  • Contract with Business Associates (BA) who have access to EHR and PII to ensure they are held to the same data protection standards, and audit them regularly
  • Protect the infrastructure housing EHR and PII using standard technical controls such as firewalls (perimeter and enclave), VPNs, network and host IPS, and endpoint protection
  • Monitor all system and network activities, optimally with automated detection of suspicious activity, particularly as it affects systems containing EHR and PII
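To make the encryption-at-rest control a little more concrete, here is a minimal sketch in Python using the third-party `cryptography` package. The record fields, file name and inline key generation are purely illustrative; a real deployment would pull keys from a key-management service or HSM and handle far more than a single record.

```python
# Minimal sketch: encrypting a patient record at rest with symmetric encryption.
# Requires the third-party "cryptography" package (pip install cryptography).
# Field names, the file path and inline key generation are illustrative only.
import json
from cryptography.fernet import Fernet

# In practice the key would come from a key-management service, not be generated inline.
key = Fernet.generate_key()
fernet = Fernet(key)

record = {
    "patient_id": "12345",          # hypothetical identifier
    "name": "Jane Example",
    "diagnosis_codes": ["E11.9"],
}

# Serialise and encrypt before the record ever touches storage.
ciphertext = fernet.encrypt(json.dumps(record).encode("utf-8"))

with open("patient_record.enc", "wb") as fh:
    fh.write(ciphertext)

# Decryption is only possible with the same key, so access to data at rest
# is controlled by access to the key.
with open("patient_record.enc", "rb") as fh:
    restored = json.loads(fernet.decrypt(fh.read()).decode("utf-8"))

assert restored == record
```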

Exposure of sensitive data is but one of the salient observations in the IBM X-Force 2012 Annual Trend and Risk Report. Download it now and get the full picture of how 2012 shaped up in terms of threats and gain intelligence into what to expect in 2013.

 

AUTHOR: IBMSecurityIntelligence
Analysis and Insight for Information Security Professionals

Posted in: Automation Testing, IBM, Opinion, Security      

 

Widespread Agile Adoption

As revealed by this year’s World Quality Report (WQR), an impressively high 93% of IT leaders say they are using agile methods in at least some of their development projects – a rise of 33% since 2010. However, despite the fact that 36% of all testing is now performed within agile projects, only 14% of IT leaders are confident that they are not experiencing any major problems with agile testing.

In agile projects, testing is tightly integrated into the core structure of every build, because new features are added in each Sprint and it is essential to ensure that these don’t adversely affect the functionality and performance of the previous app releases. This integration is proving to be even more important as agile, and therefore agile testing, are extending into DevOps to support the continuous deployment of changes and updates to production environments. Whereas in traditional waterfall projects businesses often leave the vast majority of testing until the end of the project, right before deployment, in an agile project the cross functional teams undertake planning, development, testing and evaluation all at the same time, in every Sprint.

Biggest Barriers to Success

This widespread adoption of agile methods and testing practices, coupled with a lack of maturity, has highlighted several barriers to agile success; all of which can easily be overcome once they are pinpointed and properly understood. Amongst the IT leaders we surveyed, the following problems were of most concern:

  • 61% feel that their biggest stumbling block on the road to success in agile testing is the lack of a good, proven test strategy that works in an agile environment
  • 55% say they have difficulty applying test automation at appropriate levels in agile projects
  • 42% cite a lack of testing tools that enable the building of reusable test sets as a chief concern
  • 35% cite insufficient specialist testing expertise in agile teams as a major issue

Round Peg, Square Hole

These problems may well stem from the fact that 43% of IT leaders still do not use a specialised approach to testing in agile projects. Whilst this is slightly down from 46% last year, it is clear that a large number of businesses are trying to apply agile methods using the same tools they used in their waterfall projects, and with insufficient numbers of dedicated agile test experts.

According to ComputerWeekly, Jose Casal from the Agile Methods Specialist Group at the BCS agrees that most organisations are undertaking incredibly shallow agile adoptions. He said: “Deep agility in the UK mainstream is probably a decade away…to help bring it forward we need to go back to the fundamentals of agile and ensure that people understand why agile is what it is, rather than a passing fad.” To help achieve this, the BCS has introduced a wide range of agile certifications.

Comparethemarket.com is also looking at the bigger picture and has devised its own specialist agile training programmes for its development staff. It also works with several universities to ensure that up-and-coming testers are fully equipped with not just technical agile skills, but also the personal and behavioural qualities required to work in an agile environment.

Automate Your Way to Agility

In agile projects, the central focus should be on speed and quality. Test automation can be a fantastic way to reduce the time spent on regression and integration testing, resulting in faster delivery and enhanced app quality. However, 35% of WQR respondents report difficulty in repeating tests across sprints and iterations.

The best way to achieve the necessary level of automation is to outsource the testing to a team of dedicated test professionals who are already adept at using an agile toolset and used to creating automated test environments. Independent testing is the most effective way to prevent a conflict of interest in validating requirements, so businesses need to engage specialised testers in order to ensure the required level of application quality. However, when you decide to outsource, selecting an appropriate partner goes far beyond simply looking for the best price. You need a partner who has genuine expertise in agility, can deliver a high level of test automation and can devise a solid agile test strategy from the outset of the development process.

A Risk-based, Test-driven Approach to Agile

A good agile test strategy will allow the test lead to use risk-based analysis techniques to define the focus of the testing effort as early as possible in the Sprint. This enables the test team to develop logical test scenarios using a test-driven approach and adopt agile automation tools to ensure they can optimise the test scripts by reusing them for every iteration. Agile teams should define these testing objectives and then build test scenarios before writing any functional code. Then, while the code is being created, testers can focus on building automated test scripts.
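As a rough sketch of that flow, the pytest script below expresses the expected behaviour as parametrised scenarios. The function name and business rules are invented for illustration; in a genuine test-first cycle the scenarios would be written before the production code and then re-run unchanged in every iteration as part of the automated regression pack.

```python
# Sketch of a test-first, reusable automated regression script (pytest).
# "calculate_discount" and its business rules are invented for illustration.
import pytest


def calculate_discount(basket_total: float, loyalty_years: int) -> float:
    # In a real test-first flow this implementation is written *after* the
    # scenarios below; it is included here only so the sketch runs end to end.
    rate = 0.05 if loyalty_years >= 3 else 0.0
    return round(basket_total * rate, 2)


@pytest.mark.parametrize(
    "basket_total, loyalty_years, expected_discount",
    [
        (100.00, 0, 0.00),   # new customer: no discount
        (100.00, 3, 5.00),   # three or more loyalty years: 5%
        (500.00, 3, 25.00),  # discount scales with basket size
    ],
)
def test_discount_rules(basket_total, loyalty_years, expected_discount):
    # The same parametrised scenarios are re-run unchanged in every sprint,
    # so new features cannot silently break earlier behaviour.
    assert calculate_discount(basket_total, loyalty_years) == pytest.approx(expected_discount)
```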

More than ever before, testers in an agile environment need to have a thorough understanding of business processes and development techniques, and create a strategy that is aligned to the wider needs of the business.

Are You Shore?

The rise of agile appears to have contributed to a decrease in offshore development models, because agile teams need to work closely together to be effective.  A hybrid approach is likely to be the most effective for the majority of businesses at the current level of agile maturity that we’re seeing. In the hybrid approach, testers from a corporate or domain-specific Testing Centre of Excellence (TCOE) collaborate with the individual project teams to maintain close relationships with discrete areas of the business, while reporting into a centralised testing leader who collates metrics reports from all the teams to get a big picture view.

Carl Bruiners, Agile Consultant at home furnishings retailer Dunelm Mill (and formerly of GE), believes that we should look at productivity, not just cost, when deciding levels of offshoring. An offshore developer may be cheaper but, he says, “It is often narrow sighted not to consider the other factors you inherit when offshoring – language difficulties, communication latency, culture…”. For an agile approach to be successful, teams really need to be in the same time zone. Remote teams should be avoided wherever possible and, rather than splitting a team across two geographical locations, it is better to create two smaller, discrete teams, one in each place, so that everyone is able to fully collaborate on every aspect of the project.

Embracing the Learning Curve

At the UK Agile Awards last year, Mike Burrows, agile expert at management consultancy firm David J Anderson & Associates, made a pertinent point that may help IT leaders take the leap to fully immersing themselves in agile. He said: “Agile could be said to mean accepting the truth that we can make progress with incomplete information. That makes the whole process – not just the specification part of it – dedicated to the process of knowledge discovery. Once we understand this, failure is neither shied away from nor pursued recklessly, but accepted as part of the learning process.”

As the WQR statistics have shown, there is a growing willingness to fully adopt agile methods and, as with all big changes, there are concerns and teething problems – particularly in agile testing – due to a lack of knowledge and maturity.

In conclusion, the best way to overcome these issues is to choose a specialist agile testing partner with both onshore and offshore capabilities to create a hybrid solution, and with proven test strategies, tools and processes that offer a high level of automation, allowing the whole team to focus on the agile principles of speed and quality.

You can download your free copy of the World Quality Report here and discover the business benefits of Sogeti’s Agile and Agile Testing services here.
Read Part 2 of my World Quality Report series here.

AUTHOR: Darren Coupland
Darren is the sector head of telecommunications at Sogeti.

Posted in: Automation Testing, Big data, Outsourced testing, World Quality Report      

 

TMap HD Webinar

Following Sogeti Delivery Director Barry Weston’s webinar on implementing TMap® HD, a quality-driven test approach for new technologies and development methodologies, we thought it would be useful to look at the potential challenges these advancements pose to your business.

You can replay the webinar here.

A New Dawn

Computing everywhere; the Internet of Things (IoT); cloud; advanced, pervasive and invisible analytics; software defined apps & infrastructure and risk-based security & self-protection. These are just some of Gartner’s predicted top 10 strategic technology trends for 2015. (Gartner Symposium/ITxpo 2014, October 5-9 2014, Orlando.) Whilst no one is suggesting that businesses should adopt all 10 trends at once, Gartner advises that we should all be making strategic decisions about each of them over the next 2 years. So what does this mean for testing? Well, these complex, interconnected technologies are causing IT leaders to seek a simpler, more flexible, business-aligned testing methodology that focuses on delivering high quality products to market in the shortest timeframe possible.

It’s not just Social, Mobile, Analytics and Cloud (SMAC) and IoT that are shaping test strategy: 93% of IT leaders are using agile methods in at least some development projects, and just over a third of all testing is now performed within agile projects. When you also consider the World Quality Report’s recent finding that only 14% of IT leaders are not experiencing major challenges with agile testing, it’s even more evident that businesses require a new test approach that enables faster, more accurate, end-to-end testing right from the start of the product lifecycle. (Statistics from the World Quality Report 2014-15.)

Rage Against the Machine

So what are the pain points that your new test strategy needs to overcome? Well, if your business and customer apps are leveraging SMAC, the focus is on delivering services to as many customers as possible on the broadest range of devices available. This means the testing requirement shifts from attempting to achieve zero-defect status to finding the fastest, most cost-effective means of discovering and fixing bugs and rolling out patches with minimum negative impact on profit and business needs. Technological advancement means the role of the tester is changing dramatically: either traditional testers need extensive training to familiarise themselves with new architecture, infrastructure, test methods and strategies, or businesses need to hire new specialists in these areas.

The Internet of Everything, with its numerous M2M connections, has given rise to unparalleled levels of potential risk, both in terms of security and the possible domino effect of an undetected bug rippling from one connected device to another. The necessity for increased test diversity, such as testing on simulators and consoles, has also made automated testing significantly more complicated. Another major area of concern when adopting multi-platform tech solutions is the current deficit of effective testing tools and the difficulty of accessing and integrating those that are available.

Mapping the Future

To combat these challenges, Sogeti has enhanced our well-known, risk-based, business-driven TMap® test approach to formulate TMap® HD: a new, adaptive, customisable, building-block approach to project and programme test management, execution and quality assurance. In step with current technology trends, it also accommodates Agile and Scrum methodologies. The end-to-end test process is embodied in a defined, structured framework that optimises testing services, enabling you to identify defects earlier, reduce timelines by at least 30%, transfer knowledge across teams and drive down overall development costs. TMap HD is a human-driven approach based on four elements – simplicity, integration, industrialisation and people – which together add up to complete test confidence, as test requirements and expectations are met even in the most complex scenarios.

Sogeti’s Delivery Director, Barry Weston, recently hosted a free 30-minute webinar introducing TMap® HD and explaining all of the business benefits. To discover how to apply the TMap® HD building blocks to test new technologies effectively, replay the webinar here.

AUTHOR: Sogeti UK Marketing team

Posted in: Agile, Business Intelligence, Cloud, Digital, Events, Mobility, SMAC, Social media, Software testing, Test Methodologies, TMap, TMap HD, Webinars      

 

In the first part of this blog we discovered that the World Quality Report statistics show that IT leaders are recognising the need to improve their processes, save costs and boost their QA activity in order to create a loyal customer base. This is why the total percentage of IT budget devoted to QA is on the rise. We also examined what this looked like in practical terms, by focussing on Vodafone and British Gas’ Energy Data Management Project. In this post, part 2, we can see how capitalising on some of the new SMAC technologies can facilitate a smooth digital transformation that actually meets customers’ ever increasing expectations.

Capitalising on Cloud

For the past couple of years Cloud has been in decline, but it is now steadily gaining momentum, with 28% of apps hosted in a Cloud environment and an expected rise to 35% by 2017. 32% of testing projects currently rely on cloud infrastructure, allowing testers to ramp up load capacity, reduce costs and optimise app delivery times. Cloud-based testing platforms are being used to test the functionality, performance and security of business-critical internal applications such as CRM and ERP.
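As a purely illustrative sketch (not taken from the WQR) of what “ramping up load” looks like at script level, the snippet below fires an increasing number of concurrent HTTP requests at a hypothetical test endpoint using only the Python standard library. Real projects would typically drive this from elastic cloud agents or a dedicated load-testing tool.

```python
# Minimal load-ramp sketch using only the standard library.
# The target URL is a placeholder; point it at a test environment, never production.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET = "http://test-environment.example.com/health"  # hypothetical endpoint


def hit(url: str) -> int:
    with urlopen(url, timeout=5) as response:
        return response.status


def ramp(steps=(5, 10, 20, 40)):
    # Each step roughly doubles the number of concurrent virtual users.
    for users in steps:
        start = time.perf_counter()
        with ThreadPoolExecutor(max_workers=users) as pool:
            statuses = list(pool.map(hit, [TARGET] * users))
        elapsed = time.perf_counter() - start
        ok = statuses.count(200)
        print(f"{users:>3} concurrent requests: {ok}/{users} OK in {elapsed:.2f}s")


if __name__ == "__main__":
    ramp()
```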

The benefits of the cloud are clear in the scalable, cloud-based solution Microsoft, iStreamPlanet and Adobe delivered for NBC’s coverage of the 2014 Olympics. By moving their media processing to a cloud-based architecture that took advantage of the substantial benefits of Windows Azure and Windows Azure Media Services, they successfully live-streamed 41 channels continuously for 18 days, delivering 3,000+ hours of high-definition, multi-format, real-time content to a vast range of devices on iOS, Windows and Android.

Buy it, Sell it, Love it – Tell Everyone

Social media is growing up: 71% of those surveyed for the World Quality Report said they rate social channels as important for customer feedback on customer-facing apps. Social media metrics and analytics are essential for creating an effective Online Reputation Management Strategy (ORMS), and they are a good means for a test team to measure existing test processes and outcomes against customer satisfaction.

eBay has proved to be a master at leveraging social media (for example with @askeBay on Twitter) to evaluate and capitalise on multi-channel opportunities and enhance customer experience in a digitalised environment. After starting its social customer service in 2011, eBay came to realise that the marketing department was not equipped with the right resources and information to field all the queries coming through social channels, which was causing failures in response times and solutions. It needed to spread the service across the business and ensure that it used the intelligence provided to prevent similar problems happening to other customers in the future, whilst also determining the best way to scale up to meet spikes in activity without wasting resources and costs. eBay adopted a similar approach to O2 – it now has hundreds of staff outside of the core team, located business-wide, who know how to use the system and tools to respond to social customer enquiries. Customers now get the right advice from the relevant expert, while duplication is avoided by a simple “lock out” system once a query has been picked up within the customer-driven obligatory response time of one hour or less. eBay even responds to social mentions where a customer has not approached the company directly but is discussing the brand with another social user. In the true spirit of social media, customers seem delighted that the brand is joining in the conversation and, above all, listening and responding to them!

You can find more detailed information on how the other SMAC technologies are affecting digital transformation projects and positively impacting customer experience by downloading your own free copy of the full World Quality Report here.

Solutions for a Smooth Digital Transformation

If you’re just starting to consider the cloud or are already utilising services from other providers, we can help you define a more strategic approach with our Cloud Readiness Assessment, which enables us to develop a tailored strategy based on the complexity of your business, your projects and the desired scale of implementation. One of the many solutions we offer is Azure – Microsoft’s cloud-based, pay-as-you-go platform for developing, managing and hosting applications.

Our Social Media Analytics Services also support you in monitoring your brand reputation and in using social media data in your operations to drive better decisions.

Read Part 1 of this blog series here.
Look out for Part 3, when we will be looking at the rise and rise of Agile.

AUTHOR: Darren Coupland
Darren is the sector head of telecommunications at Sogeti.

Posted in: Cloud, Transformation, World Quality Report      

 

Quality Breeds Loyalty

It’s no secret that our “always on” world has taken business customers’ and consumers’ expectations to new heights, with both groups demanding that apps and services deliver peak performance and optimum reliability, with a consistently seamless, multichannel end-user experience. But how does this trend affect Testing and Quality Assurance (QA)?

As the World Quality Report 2014-15 (WQR) key findings clearly show, IT leaders are recognising that, in order to create a loyal customer base, they need high quality applications which deliver a great customer experience. Nothing turns customers off like applications that don’t work, are difficult to use or perform poorly, all of which can lead to lost customers and lost revenue. As a result, they need to boost their QA activity, which is why the total percentage of IT budget devoted to QA has risen from 18% in 2012 to 26% in 2014. The WQR also indicates that 15% of the most forward-thinking C-suite executives are already spending 40% of their IT budget on quality-related activity, with one in five forecasting a rise to this level in the next two years. The rapid advancement of Social, Mobile, Analytics & Cloud (SMAC) technologies, as well as the Internet of Things (IoT), has led to 53% of the overall test budget now being allocated to new development projects. In turn, we’re seeing a distinct shift away from maintenance projects, testing new releases of existing apps and integration initiatives.

In this two-part blog we will take a look at how digital transformation and customer demands are driving changes in testing and development. In this first part, we examine how customer demands – such as being greener, meeting industry regulations and saving costs – give rise to the requirement for a change in process and an increase in QA test budgets. We will also touch on how globalisation and digitalisation are driving a hybrid solution for QA and testing activity. In part 2 we will look at live examples of how some of the new SMAC technologies are affecting digital transformation projects and positively impacting customer experience.

The Smarter Path to Transformation

With their Energy Data Management project in partnership with British Gas, Vodafone has wisely capitalised on the fact that the Energy & Utilities sector has been greatly unsettled by reports that global energy demand is set to double by 2050, causing concerns about climate change and giving rise to more restrictive regulations on usage and emissions. This, coupled with time-to-market pressures and the necessity for a good customer experience, has outstripped even cost saving as a competitive differentiator in the Utilities sector!

The upshot of these industry concerns is an increase in digital transformation projects focussing on smart metering and smart grids, with the result that Energy & Utilities (and also Transportation) have shown the biggest industry increase in QA spending, at between 27% and 31%. Vodafone and British Gas have identified these areas as their customers’ greatest pain points and offered them bespoke energy solutions that use Machine to Machine (M2M) and mobile recognition technology to provide “real time” electricity, gas and water statistics every 15 minutes! This improved granularity enables businesses to identify where and when energy is used, pinpointing wastage and resulting in up to 20% standard energy and cost savings, plus up to 25% savings on carbon tax. The fact that Vodafone and British Gas both use the technology to monitor their own usage shows their belief in the initiative!
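To make the “improved granularity” point concrete, here is a small, purely illustrative sketch: given 15-minute meter readings, it flags consumption outside business hours, which is one simple way such data can pinpoint wastage. The readings, opening hours and threshold below are all invented.

```python
# Illustrative sketch: flagging out-of-hours energy use from 15-minute readings.
# All numbers, hours and thresholds below are invented for the example.
from datetime import datetime, timedelta

OPEN_HOUR, CLOSE_HOUR = 8, 18          # assumed business hours
BASELOAD_KWH = 0.5                     # assumed acceptable out-of-hours draw per 15 min

# Fake readings: (timestamp, kWh consumed in the preceding 15 minutes).
start = datetime(2014, 11, 3, 0, 0)
readings = [(start + timedelta(minutes=15 * i), 1.2 if 32 <= i < 72 else 0.9)
            for i in range(96)]        # one full day = 96 quarter-hour slots

waste = [
    (ts, kwh)
    for ts, kwh in readings
    if not (OPEN_HOUR <= ts.hour < CLOSE_HOUR) and kwh > BASELOAD_KWH
]

wasted_kwh = sum(kwh - BASELOAD_KWH for _, kwh in waste)
print(f"{len(waste)} out-of-hours intervals above baseload, ~{wasted_kwh:.1f} kWh potentially wasted")
```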

Digital Adaptation

The World Quality Report shows that globalisation and the necessity for digital transformation are dissipating the trend for pure centralisation of QA and testing activities, in favour of a more hybrid centralised-and-localised solution, facilitated by partial outsourcing to a third-party expert test partner. In-house testing has decreased rapidly year on year, from 51% in 2012 to 41% in 2013 and 30% this year. This new hybrid trend promotes shared responsibility and, thankfully, enables a much better level and quality of responsiveness to the individual needs of the various lines of business.
PayPal only began to establish localisation at a very late stage of its development and quickly realised that its global services were not meeting customers’ expectations. It sought to improve the quality of its regional products so it could support millions of customers in over 190 countries, in their own language. The overriding goal was to increase its Net Promoter Score (customer satisfaction rating) so it could determine how loyal customers are to the brand. Initially, with the help of an outsourced partner, PayPal created a centralised team in Beijing with multilingual QA engineers to support its centralised Global Localisation Team in testing major releases and country-specific features. In the last four years it has eliminated the vast majority of bugs, achieved full localisation and automation, and transitioned a team from solely dealing with language quality to fully supporting global payments in 25 countries. PayPal now provides localised web and mobile services in 21 world markets.
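For readers unfamiliar with the metric mentioned above, Net Promoter Score is calculated from 0-10 survey responses: scores of 9-10 count as promoters, 0-6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors. A tiny sketch, with made-up responses:

```python
# Net Promoter Score: % promoters (scores 9-10) minus % detractors (scores 0-6).
# Survey responses below are made up for illustration.
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

responses = [10, 9, 9, 8, 7, 6, 10, 5, 9, 8]
print(f"NPS: {nps(responses):+.0f}")   # (5 promoters - 2 detractors) / 10 -> +30
```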

Sogeti Solutions for a Smooth Digital Transformation

For clients who feel offshore delivery isn’t the right option, Sogeti’s UK lab, Sogeti Studio, coupled with the Capgemini Group’s highly skilled web and mobile testing resources, provides you with access to close collaboration, local resources and devices, and the opportunity to see your applications being tested in practice.

Look out for part 2 of this blog post on Digital Transformation and Testing with SMAC technologies.

 

AUTHOR: Darren Coupland
Darren is the sector head of telecommunications at Sogeti.

Posted in: Cloud, Quality Assurance, Reports, Research, Sogeti Studio, Transformation, World Quality Report      

 

Trusteer Rapport Defends Against Carberp Malware

Earlier this week, IBM reported that the Carberp malware source code was on sale in Russian forums. Since then, the source code has been leaked and is now available to criminals and researchers alike for free.

The leaked Carberp source code includes an “anti-Rapport” function as well as an old copy of IBM Security Trusteer Rapport’s installer for testing purposes. Over the years, we’ve seen many anti-Rapport modules incorporated into different malware strains. It started back in 2009 with a series of anti-Rapport modules incorporated into Zeus, followed by a couple in SpyEye, and similar modules have since appeared in other malware strains, including Carberp. In terms of functionality, these modules try to achieve one of three main goals:

  • Prevent Trusteer Rapport from installing on the computer
  • Remove Trusteer Rapport from the computer
  • Avoid one or more of Rapport’s browser protection mechanisms

Over the years, we’ve learned how to effectively fight these attempts with a combination of strong intelligence capabilities, multiple layers of protection and rapid response capabilities.

Just as with previous attempts, Trusteer Rapport protects users from Carberp and is not affected by the anti-Rapport function. This was true before the Carberp source code leak and is still true now.

Learning More About Carberp

IBM intelligence operations collect current threat data from tens of millions of protected endpoints and other sources around the world. Our special response teams track every piece of financial malware 24/7 and can swiftly analyse it and develop countermeasures. By designing an infrastructure that allows for maximum flexibility, Trusteer Rapport can quickly adapt to any change in the threat landscape and shut down cyber criminals’ window of opportunity. As it combats sophisticated financial malware, Trusteer Rapport represents a serious roadblock to malware-based fraud. We take great pride in the fact that criminal groups see us as a threat to their livelihood and are constantly trying to find ways around us. We also remain vigilant and keep enhancing our intelligence capabilities, product functionality and operational processes.

Update: July 25, 2013

A French researcher was able to find a scenario in which our Carberp protection mechanism didn’t kick in. While we believe this is a rare scenario, we activated another layer to protect against it. We would like to thank this researcher for his help and cooperation.

 

AUTHOR: IBMSecurityIntelligence
Analysis and Insight for Information Security Professionals

Posted in: IBM, Innovation, Security      

 

To push forward Sogeti Studio, as well as the more typical Sogeti service offerings, it is important to maintain an understanding of the test tools available in the market. This is not just from the perspective of Sogeti purchasing such tools: we may be able to help a client make a selection from a range of options, or we may have consultants at a client site where new tools are already in place. One such tool, from IBM, is Service Virtualisation (SV) – see the links to useful YouTube videos for this tool at the foot of this blog.

In IBM’s own words, “the main aim of Service Virtualisation is to deliver types of testing (ie E2E functional, regression, load, integration) to address challenges of highly complex and integrated applications much EARLIER in the overall project lifecycle.” I am sure many of you can relate to client projects where you have worked with numerous integrated applications; the key selling point from IBM is that the tool can enable test teams to automate integrated testing much sooner, which can contribute towards the much-talked-about holy grail that is the ‘shift left’ concept.

IBM provides clear, written instructions on using the tool, which simulates virtual applications within an environment so that integrated testing can be executed even when the actual application is unavailable for development reasons. In addition, testers can create their own ‘partial’ environments, as the tool creates stubs and drivers in place of actual systems. This is useful in environments which send ‘acks’ from one system to another on receipt – e.g. in high-volume financial businesses. The tool is used to give software development and QA/testing teams access to dependent system components that are needed to exercise an application under test (AUT) but are unavailable or difficult to access for development and testing purposes. With the behaviour of the dependent components “virtualised,” testing and development can proceed without accessing the actual live components.
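The sketch below is not IBM’s tool, just a generic illustration of the underlying idea: a stand-in service that returns a canned acknowledgement so that an application under test can be exercised even when the real downstream system is unavailable. The port, path and payload are arbitrary choices for this sketch.

```python
# Generic service-virtualisation idea in miniature: a stub that returns a
# canned acknowledgement in place of a real downstream system.
# Port and payload are arbitrary choices for this sketch.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class StubAckHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read (and ignore) the incoming message, much as a recording-based
        # virtual service would before replaying a stored response.
        length = int(self.headers.get("Content-Length", 0))
        _ = self.rfile.read(length)

        body = json.dumps({"status": "ACK", "reference": "STUB-0001"}).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # The application under test is pointed at this endpoint instead of the
    # real back-office system while that system is unavailable.
    HTTPServer(("localhost", 8099), StubAckHandler).serve_forever()
```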

Benefits of Service Virtualisation:

Key benefits of the tool can include:

- More time available to do development or testing work, or simply getting the same work done in less calendar time, so faster time to market for the application.

- Lower cost, often in the form of avoiding lab hardware and software expenditures on prerequisite systems required for the application, but out of scope for the development or testing activity.

- Higher quality, because typically the environment is stabilised, and so defects are caught much sooner in the software life cycle than they would be without Service Virtualisation (SV), where the team waits until the entire application assembly is available before beginning ordinary testing activities.

In order to move to an effective agile development process you truly need connectivity to the interfaces that the new code will be interacting with. These could be real systems, but that would require vast amounts of hardware and application software to support the large number of agile development teams. SV addresses this constraint by allowing each development team, and potentially each developer, to have its own view of the entire application stack with all interfaces represented.

Conclusion:
Over the last decade, virtualisation has transformed the manner in which IT services are delivered, by providing a platform to reduce cost through consolidation, increase agility and deliver improved service levels. However, virtualisation does present new challenges to IT teams, e.g. how to design and deliver the infrastructure to support it, how to secure and manage the virtual environment, and how to ensure that issues are caught before they reach production or live environments.

Find out more about Sogeti Studio and its capabilities here.

YouTube links:

What is Service Virtualisation? http://www.youtube.com/watch?v=Np5_O43BFD4&index=36&list=PLEE1757606E9348F5

When to use Service Virtualisation? http://www.youtube.com/watch?v=j1f5vP3gCIM&index=30&list=PLEE1757606E9348F5

Service Virtualisation – Return on Investment http://www.youtube.com/watch?v=LrrcKl8Gb9w&list=PLEE1757606E9348F5&index=34

Service Virtualisation – Increasing Team Velocity and Delivering Higher Quality Software http://www.youtube.com/watch?v=npimRG1bhNQ&list=PLEE1757606E9348F5&index=20

 

AUTHOR: Richard Payne

Posted in: IBM, Virtualisation      

 

The area of accessibility testing (i.e. testing functionality to suit users with certain disabilities) is becoming increasingly important in our industry, especially with the move to digital strategies and displaying information across multiple devices and platforms. The W3C’s Web Content Accessibility Guidelines (hence the acronym WCAG) provide standards and guidance to help companies and development teams deliver websites that cater for users with disabilities, making access easier – or in some cases possible at all – on PCs and mobile devices.

Users who might benefit from W3C standards may have one or more of the following conditions: visual or hearing impairments, motor impairment, seizures, or cognitive disabilities such as dyslexia.

Some of you are probably well versed in this, and maybe even experts, but there will be others thinking ‘well, so what?’. The thing is, an increasing number of organisations are delivering functionality in their websites to comply with W3C standards. Sogeti, and our Studio, therefore need to maintain an awareness of the types of functionality that could be delivered for testing and of how to test it effectively across various platforms, including mobile devices.

Now, we don’t want to lecture our clients on what sort of development they should be delivering with respect to accessibility; however, Sogeti believes that working in partnership with our clients to raise awareness of W3C considerations will help them build solutions that can be accessed by all – after all, increasing your audience should lead to increased conversion rates on ecommerce platforms, or simply widen the reach of the message your company is trying to deliver to market.

In essence, the laws which have pushed this area forward are the Disability Discrimination Act 1995, the Special Educational Needs and Disability Act 2001, and the Equality Act 2010. The main organisations behind the standards are the World Wide Web Consortium, which has produced the following guidelines (http://www.w3.org/standards/webdesign/accessibility), and the British Computer Society Disability Group: http://www.bcs.org/category/18035.

In terms of mobile testing against the W3C standards, the guidelines apply across all platforms, as they are designed to be broadly applicable to current and future technologies. Obviously, as testers, we are constrained by what our clients deliver by way of accessibility functionality; that said, we do have a part to play in helping clients formulate requirements and deliverables where possible.

Given that the rendering of websites can differ across platforms, it is very important, when testing for accessibility on a client project, to test the same functionality on a range of mobile devices. Even if the accessibility standards are consistent across all devices, what the client delivers may not work the same across all platforms and devices. It is up to us as testers to consider mobile usage when testing against these standards, and to steer clients accordingly.
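As one small example of the kind of check that can be automated consistently across desktop and mobile renderings, the sketch below scans an HTML page for img elements without an alt attribute – one of the most basic WCAG requirements. The sample markup is invented; real projects would pair such scripts with dedicated audit tooling and manual assistive-technology testing.

```python
# Minimal WCAG-style check: find <img> tags with no alt attribute.
# Uses only the standard library; the sample HTML is invented.
from html.parser import HTMLParser


class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        # An empty alt="" is valid for decorative images, so only a missing
        # attribute is flagged here.
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "<no src>"))


sample_html = """
<html><body>
  <img src="logo.png" alt="Company logo">
  <img src="hero-banner.jpg">
  <img src="chart.png" alt="">
</body></html>
"""

checker = MissingAltChecker()
checker.feed(sample_html)
print("Images missing an alt attribute:", checker.missing)  # ['hero-banner.jpg']
```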

Hardware for accessibility (e.g. special keyboards, rests and mouse types) often accompanies PC testing and may not apply to mobiles. However, there are some software types that should be considered when planning mobile accessibility testing (e.g. speech recognition and magnification). This short guide (here) lists areas to consider when delivering accessibility testing, plus suggested tests from the W3C website which can be adapted or used as they are.

Why not get in touch, and see how Sogeti Studio can assist you with your Accessibility considerations. Email us today at: enquiries.uk@sogeti.com

AUTHOR: Richard Payne

Posted in: e-Commerce, Human Resources, mobile testing, Sogeti Studio      

 

We have already discussed some time back how important the cloud is today and the benefits that this technology can bring to the client. However, the cloud still comes with doubts and fears: data security, cyber-attacks (don’t miss Sogeti’s book about it), government intrusion, lack of support, etc. This is why hybrid cloud is increasingly a trend today, and a good compromise between private and public cloud. How can we help our clients with this challenge?

Hybrid cloud merges the benefits of public cloud (Amazon, O365, etc.) with the comfort of on-premise technology. Think of how our clients need to communicate with external partners who sit outside their network or Active Directory in terms of authentication. A hybrid approach can keep the two ecosystems completely independent: sensitive information remains local, while everything that needs to be shared can live in the cloud. Especially in programmes like ECM (Enterprise Content Management), where collaboration (internal or external) is crucial, hybrid cloud can make life much easier.

Moreover, let’s also remember the benefits in terms of cost: switching from a CAPEX model (where you buy all the required hardware, even if you only use 10% of it) to an OPEX approach, paying only for what you consume. Software vendors have clearly understood what companies are looking for, and many collaboration products today are moving in this direction – a couple of examples are NemakiWare, an open-source CMS, and the hybrid solution offered by SharePoint 2013.
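The CAPEX-versus-OPEX point can be made with some back-of-the-envelope arithmetic. The figures below are entirely invented for illustration; they simply show why paying per use can beat owning hardware that sits mostly idle.

```python
# Back-of-the-envelope CAPEX vs OPEX comparison. All figures are invented.
YEARS = 3

# CAPEX: buy servers sized for peak load, even if average utilisation is ~10%.
hardware_purchase = 60_000          # upfront server spend
annual_maintenance = 8_000          # power, cooling, support contracts
capex_total = hardware_purchase + annual_maintenance * YEARS

# OPEX: pay only for the capacity actually consumed in the public cloud.
avg_monthly_cloud_bill = 1_200      # reflects ~10% average utilisation
opex_total = avg_monthly_cloud_bill * 12 * YEARS

print(f"3-year cost owning hardware : £{capex_total:,}")
print(f"3-year cost paying per use  : £{opex_total:,}")
print(f"Difference                  : £{capex_total - opex_total:,}")
```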

Of course hybrid cloud is not magic, and we have to keep in mind the pros and cons of this approach:

PROS

High performance: The network must be secured, but with the use of virtualisation a high-performing environment can be built.

Scalability: When demand increases, the cloud can extend capacity almost immediately while preserving flexibility.

Security: While many people continue to think that the cloud is not secure (which I don’t believe), the hybrid cloud guarantees that truly private content stays isolated within the company.

Low price: With very low operational costs and no hardware maintenance for the cloud, companies can focus their efforts on improving features for their businesses instead of buying servers that are only partially used.

 

CONS

Infrastructure: Putting in place all the configuration necessary for a secure hybrid cloud is not easy and carries significant overheads.

SLAs: SLAs need to be constructed carefully so that they serve both your on-premise users and those in the cloud.

Data Protection & Compliance: There are two independent environments that both need to respect the company’s compliance rules, and data must be protected and secured in two separate ecosystems.

Complex Networking: The network accounts for around 70% of a hybrid solution. This networking is complex and requires strong skills.


So, what’s the point? Our clients are looking for scalability and flexibility and we can reach these goals only by being flexible ourselves and creating flexible solutions.

To read the original post and add comments, please visit the SogetiLabs blog: Hybrid Cloud, Hybrid clients…hybrid solutions!

Related Posts:

  1. The cloud as disruptive move toward Sustainability
  2. Cloud shadows on corporate IT ( Part 2 )
  3. How would you prefer to be robbed? (how to convince your boss or your grandmother to join the Cloud)
  4. Is Cloud a return to the Stone Age?

 

AUTHOR: Manuel Conti
Manuel Conti has been at Sogeti since 2010. With a technical background, he leads onshore and offshore teams, mainly on the Microsoft SharePoint platform, building intranet and internet web sites.

Posted in: Cloud, Opinion, Security, Virtualisation      

 

On January 12, SogetiLabs Fellow Daniël Maslyn hosted a webinar on Robotics and IT Testing. You can watch the full recording (starts at 6’45”) by clicking here (we apologize for the poor sound quality due to a technical issue with the platform). Daniël’s deck is also available below.

The next SogetiLabs webinar is scheduled for February 2 at 5pm CET.

  • Topic: The Recover Approach: Reverse Modeling and Up-To-Date Evolution of Functional Requirements in Alignment with Tests
  • Speaker: Albert Tort

To read the original post and add comments, please visit the SogetiLabs blog: Webinar: Uniting Robotics and IT Testing

Related Posts:

  1. Uniting Robotics and IT testing
  2. The H+ shift of Google (Part 3/4: Robotics)
  3. Modelization of Automated Regression Testing, the ART of testing
  4. Testing is dead, long live the Tester.

 

AUTHOR: Daniel Maslyn
Daniël Maslyn is part of the Sogeti Labs Group and is a passionate and creative software testing professional with over 15 years of experience in real-world situations, ranging from hands-on operational testing roles to test management positions. He is knowledgeable in a variety of test methods, techniques and testing paradigms.

Posted in: Webinars      