Wikipedia defines App Store Optimisation (ASO) as the process of improving the visibility of a mobile app in an App Store such as iTunes or Google Play, similar in concept to the technique of Search Engine Optimisation (SEO) used for websites.

More specifically, ASO is defined as the process of improving the ranking in an App Store’s search results and top charts rankings, in order to drive more downloads of the app.

Worldwide web ‘search’ is dominated by Google; the App Store industry, however, is dominated by two major players: Apple with its iTunes store, and Google with Google Play for Android. Between them they account for more than 90% of the app download market, with other companies such as BlackBerry (3% of downloads) and Microsoft (4%) making up much of the remainder. It is widely accepted that commercial ASO activity should concentrate on the Apple and Google platforms foremost, with analysts predicting a pre-eminence of Google Play in the medium term due to the open-source nature of Android.

In recent years there has been an explosion in the number of apps available – 2014 figures show that the iTunes store has upward of 1.2 million apps for download (with an estimated $10 billion paid to developers) and Google Play 1.3 million (with an estimated $3.2 billion in revenue to developers). The number of app users across the world is expected to grow by between 25 and 30% per year.

As the number of apps has increased, the chances of a potential user finding a single app (with or without searching for it) have plummeted. This has led to a whole new discipline, with ‘App Marketers’ now carrying out ASO to tap into this lucrative market and ensure their clients’ apps appear near the top of charts and search results for specific keywords, achieving higher rankings than competitors’ apps.

The name ASO is only around 3 years old, and methods are still somewhat patchy. The main focus is Keyword Optimisation – also known as Keyword Research – the process of researching, analysing, and selecting the best keywords to target in order to drive qualified users to the app (as in standard SEO for the web).

Research on the subject suggests that the following areas are widely accepted by ASO practitioners as best practice:

On Page ASO Optimisations

First, a note on the use of ASO keywords:

Intelligent companies clearly have an understanding of their customer base and the language their customers use – which can be translated into the typical keywords those customers are likely to use when searching for mobile apps. In addition, knowing what keywords competitors use for their products can be important, although what you do with that information is debatable – for instance, whether to target popular keywords for your own app and risk competing with a rival, or to find niche keywords instead.

What is beyond debate, though, is that the use of keywords, and an understanding of them, should be a fundamental aspect of app development from the start – not something dreamt up 10 minutes before publication. An app’s search keywords should be tweaked throughout its lifetime to help optimise potential downloads.

Those who have studied ASO indicate that the best uses of keywords include both functional phrases (describing features) and categorical phrases (describing the primary market or category), with secondary and even tertiary choices also being made.

Autocomplete is also a powerful feature to be aware of and to use when deciding on keywords. These are the suggested searches that Google and Apple present as you start typing; although neither publishes the algorithms they use, we can safely assume the suggestions are based on popular query terms.

Research also indicates that keyword misspellings feature very often – Google Play data suggests that 50% of app searches involve popular misspelt queries – so this can be another area to capitalise on.

There are a number of other best-practice suggestions around keywords, such as not repeating terms, not including both singular and plural forms of a keyword, ignoring small ‘connectors’ such as “a” and “the”, and avoiding the use of competitors’ names.
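These rules are mechanical enough to script. As a rough illustration – the stop-word list and the strip-a-trailing-“s” plural handling below are simplistic assumptions of my own, not App Store logic – a candidate keyword list could be tidied like this:

```python
# Hypothetical helper: tidy a candidate keyword list per the best-practice
# suggestions above (no duplicates, no singular/plural pairs, no connectors).
STOP_WORDS = {"a", "an", "the", "and", "of"}

def clean_keywords(candidates):
    seen = set()
    result = []
    for kw in candidates:
        word = kw.strip().lower()
        if word in STOP_WORDS:
            continue                      # drop small connectors
        singular = word[:-1] if word.endswith("s") else word
        if singular in seen:
            continue                      # drop duplicates and plural twins
        seen.add(singular)
        result.append(word)
    return result

print(clean_keywords(["Photo", "photos", "the", "Editor", "photo"]))
# -> ['photo', 'editor']
```

Real keyword research would of course weigh search volume and competition as well; this only enforces the hygiene rules above.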

Use of Keywords in App Title

Taking likely keywords and putting the most relevant in the app’s title is the first and most important stage in improving an app’s search rankings. In fact, research shows that simply having a relevant keyword in the title makes a typical app 10% more ‘successful’ than one without. However, a recent report found that around 84% of apps in the App Store did not have any keywords in their titles!
The title should be descriptive and indicate what the app does, but it should be kept short to avoid issues with, for instance, different Android screen sizes and resolutions – truncation is considered a real turn-off. Apple has a 25-character limit for this.

The keyword in the app title should be the one most heavily used by searchers; if it isn’t, efforts will be wasted. App titles also shouldn’t change, especially when launching new versions: users speak to other users and recommend downloads, and changing the title is perhaps the easiest way to prevent an app from being searched for and found.
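As a quick sanity check, the two title rules above – include the primary keyword, stay inside the character limit – could be sketched as follows. The function name and message wording are illustrative, not part of any store’s tooling:

```python
APPLE_TITLE_LIMIT = 25  # the character limit cited above

def check_title(title, primary_keyword):
    """Flag the two title issues discussed above:
    missing primary keyword, and truncation risk."""
    issues = []
    if primary_keyword.lower() not in title.lower():
        issues.append("primary keyword missing from title")
    if len(title) > APPLE_TITLE_LIMIT:
        issues.append(f"title is {len(title)} chars; may be truncated")
    return issues

print(check_title("Super Fun Puzzle Game Deluxe Edition", "puzzle"))
print(check_title("Puzzle Pro", "puzzle"))  # -> [] (no issues)
```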

Use of Keywords in the App Description

Similar to meta description tags on a website, your app’s description must clearly describe what the app is, what it does, and what the benefits are for users. Developers are increasingly employing professional copywriters to get this right. I’m not suggesting everyone should do this, but with a 4,000-character limit on descriptions in the iTunes store, and the app description being the primary way of advertising the benefits of an app to a user, your words need to be powerful and succinct.

Use of an App Icon

The App icon is the first thing people see when they view a listing, and it should convey what the app is in the simplest, most visual manner. The benefits of having a great icon shouldn’t be underestimated, and the most popular apps (Facebook etc.) all have a very recognisable brand icon.
Developers are increasingly hiring professionals to design logos.

Use of an App Screenshot

Again, humans are visual creatures, and users want to see a preview of what they’re getting before they potentially spend their money. Research shows that screenshots are best appreciated when they convey a specific benefit of the app, and supplemental text should be used whenever possible to clarify each screenshot.


App Categorisation

It’s obvious, but to be successful an app should be categorised appropriately – research shows that more than 50% of users scroll through categories to find interesting apps to download. A secondary and even a tertiary category should also be considered.

YouTube Demo

Google Play Store recently added a new feature that allows you to upload a YouTube video of your app in action. In this increasingly competitive market, it is becoming more and more important to highlight the best parts of an app, and video is an ideal way to do this.


Localisation

Translating an app into different languages can greatly increase downloads and expose it to a larger potential audience. In one study, localisation increased downloads of an iPhone app by more than 700%. There is no need to translate into every language, though – carefully consider which audiences offer the most potential and target those.

Leverage of Plug-Ins

Although it would be difficult to prove, Google almost certainly prioritises apps that feature Google+ plug-ins.

Off Page ASO Optimisations

Ratings/ Reviews

Ratings and reviews are also very important and – trust us – users do look at an app’s reviews before downloading. A developer should therefore ensure that an app gets genuine reviews and, in the case of negative reviews, leverage the feedback to address users’ issues and recommendations. It’s important to encourage happy customers to leave reviews for the app, whether via a ‘Rate this app’ popup after people have been using it for a while, or by another method.

Link Building

Taking Google as an example, the Play Store accesses Google’s search indexes, so this means that links from popular and authoritative websites will certainly help the popularity of an app. Work with bloggers to get them to review the app, make sure there’s a link to it on your website, and across other relevant sites that your users will be visiting.

A Note on Number of Downloads

Although figures are hard to come by, as companies typically keep download data confidential for commercial reasons, popular apps appear higher on download charts, and users undoubtedly download apps based on their popularity – and not necessarily on need. This is not something we can easily manipulate, but it is worth keeping in mind. Our ASO/Mobile Apps Testing Service can help to build your app’s profile in App Stores, leading to more downloads.

White Hat versus Black Hat

White Hat is a term used for performing ASO in a way App Stores would approve of and accept.

Black Hat ASO is a term for performing ASO in a way App Stores would definitely not approve of or accept, such as falsifying downloads or ratings, perhaps by using bots or other techniques to make it seem like an app is more important and influential than it actually is. For reputational reasons, and to ensure you’re not blacklisted by App Stores in the future, you should not stray into any technique – or work with any developers using techniques – that even comes close to what could be considered Black Hat.

Testing of ASO

Discovery/ Tracking

There are a number of open-source app keyword spy/discovery tools which claim to pull (and in some cases analyse and compare) the keywords used by an app. Sogeti Studio demos a number of these offerings to understand how they function and how they can be used.


There are analytics tools available to track app keywords and to check where a mobile app stands in terms of popularity. Free tools include Google’s Mobile App Analytics, MobileDevHQ, Distimo Analytics, and Mopapp; paid solutions include Mobile Action, AppCodes, AppTweak, Sensor Tower, and Searchman. Sogeti Studio will soon carry out a study of the tools available in the marketplace, so you can find the one that is right for you.

What Testing Service could Sogeti Studio offer a client?

Sogeti Studio is our UK-based web and mobile testing lab. Current services offered within the Studio include mobile strategy, application development, testing (functionality, compatibility, performance, usability, security and accessibility) and portfolio management. ASO can be incorporated into our development and testing cycles – both on-screen and off-screen – with recommendations for improvement on:

  • Use of keywords in the App Title;
  • Use of keywords in the App Description;
  • Quality of the App icon (if present);
  • Number and quality of screenshots (if used);
  • Applicability of App categorisation;
  • Quality of any YouTube demo videos accompanying the App;
  • A marking for localisation efforts;
  • A rating for including (say) any Google plug-ins;
  • Number and type of user comments, use of forums, etc.;
  • Availability of the App as a link on specified key sites.

Contact us today, or call us on +44 (0) 20 7014 8900.

AUTHOR: Jason Mudge

Posted in: Business Intelligence, Opinion      


The first Testing Showcase North event was held in the labyrinthine Palace Hotel in Manchester on Thursday 19th February 2015.  The event was organised by UNICOM, with sponsors Capita, NCC Group, Planit, XebiaLabs and … Sogeti – each of whom had a stand at the event to publicise their offerings and seek to make new contacts within the audience.

I am a Test Consultant and have been working for Sogeti for 4 years now. As I am based in Manchester, I was very pleased to be invited by the Sogeti Marketing team to attend and assist my colleagues manning the Sogeti stand, and generally networking with the many test professionals from all over the country who attended.

I have no sales or marketing experience, and had no idea what to expect when I arrived at 0745 to meet the team and to (hopefully) enjoy the day. The team comprised Andy Ainsworth, Leo Roberts, Anthony Muddembuga and Andrew Fullen. We had a selection of colourful brochures detailing Sogeti services, and also some great freebies – key-shaped memory sticks, excellent chocolates and a coaster bearing the legend “Keep Calm and Trust the Tester”. So, although I attended a couple of the formal presentations (more of which later), I spent most of my time manning the stand and talking to testers from all over.

The first indication I got about how successful our freebies were was when 3 people came over saying that they had heard that we were giving away some really cool coasters (or beer mats, as one person insisted on calling them). The coasters were incredibly popular. I don’t know how many we started with, but there were none left when it came time to pack up at the end of the day. One guy made a point of grabbing a full selection of freebies, and a couple of brochures, whilst telling me that Sogeti had the “coolest” stuff and stand at the event. The memory sticks and chocs went down really well, too – so Marketing outdid themselves in coming up with the giveaways.

The team were very supportive and welcoming, and we worked really well together. All of us greeted delegates as they came over to the stand, establishing what their interest was and handing them on to the team member best qualified to assist with their questions. Judging by the great conversations we had, it proved to be a successful day on the stand.

As mentioned above, there were a number of presentations in the main hall, the first of which was by Sogeti Managing Consultant Andrew Fullen. This was described in the event brochure as follows:

Case Study 1: App Automation in an Agile World – A real world case study
During this session Sogeti’s Managing Consultant, Andrew Fullen, will present a real world case study focusing on how to successfully move from a manual agile practice for mobile testing to an automated agile framework. He will cover the processes that his team followed and the challenges that they overcame, in order to deliver a successful automated mobile testing project.

Andrew did an excellent job and his session was very well received by the attendees – despite some technical issues. This was emphasised to me when one particular delegate – Dot Graham – made a point of coming to the stand to say how impressed she had been by Andrew and how well he knew his subject.  Now, this may not mean much to some of you, but Dot is one of the leading experts in Software Testing.  Her profile on her website reads as follows:

“Dorothy has worked in software testing for over 40 years. Originally from the US, her first job was developing testing tools for Bell Labs in New Jersey. She then worked as a developer for Ferranti Computer Systems in Manchester before moving to the National Computing Centre where she developed training courses in software engineering. She later became an independent consultant specialising in software testing and founded Grove Consultants in 1988. Over the next 20 years, Grove gained a reputation for the high quality of their training courses, materials and presentations establishing a world-wide client base. In 2008 she left Grove to work again as an independent consultant. Dorothy now uses her experience to help educate and motivate others through conference presentations and tutorials.”

She is co-author of four books: an introductory book for software testing together with more in depth books on Software Inspection and Test Automation, and was a founder member of the ISEB Software Testing Board. She was a member of the working party that developed the ISTQB Foundation Syllabus and in 2012 she was awarded the ISTQB International Software Testing Excellence Award.

I think that the fact that she made a point of coming over to the Sogeti stand to congratulate us on Andrew’s presentation speaks absolute volumes. After loading her down with chocolates and a coaster, I wasted no time in asking Leo to introduce her to Andrew, and the two had a long chat. Andrew can be really proud of himself.

After all of the presentations, there were drinks all round, and everyone – sponsors and delegates – mingled, chatted and networked.

I thoroughly enjoyed the entire day. The atmosphere was relaxed and everyone was very approachable and friendly. I felt that we worked together really well as a team, and I was very pleased to have been invited to assist. Now that it’s all over, I can say it was great to spread the word of Sogeti in the North.


AUTHOR: Mike Young

Posted in: Testing and innovation      


Image credit: sqasolution.com

People who have worked on Agile projects before will probably have noticed or heard this: Agile projects require… agility! But then, agility can mean several things. Of course, it involves following an iterative process, responding to changes, improving time to market and many other important aspects.

However, what people often don’t realise when joining an Agile team is that their usual duties will change. Indeed, with waterfall, each activity is clearly identified and segregated: a business analyst will gather requirements from his/her customers, transform them into specifications and forward the documents to the developers. The developers will then build the solution from the specifications and deliver it to the testers, who in turn will validate the deliverables with the involvement of users/customers.

In Agile development lifecycles, activities and roles are more mixed up. Particularly in testing, everybody (more or less) has to test! The main reason is to meet the objective of improving time to market. If you want to deliver new functionality right at the end of the iteration, you will have to shorten testing periods as much as possible. And, as customers won’t sacrifice quality, you’ll have to modify the way you test to ensure quality is monitored and controlled during the whole product lifecycle. This requires business analysts, developers and customers to participate in the testing of the deliverables. Nevertheless, they will execute tests at different levels and at different times.

First, it is essential that developers run unit tests (manual and automated) of their developments. If not, the number of defects found later in the process will be too high to allow a quick release. A good set of integration tests is also necessary to allow frequent and reliable deliveries. Once new functionalities are deployed in a quality environment, smoke tests can be executed by the tester, either manually or by running scripts. Then, if the quality is satisfactory, the validation process can start:

  • The business analysts will start verifying that what they specified is actually available
  • The testers will perform advanced tests (functional, performance, limits, etc.)
  • The customers will execute the acceptance tests to confirm that the deliverables are aligned with their requirements
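To make the developer-level checks mentioned above concrete, here is a minimal sketch of an automated unit/smoke test, where `ShoppingBasket` is an invented stand-in for a real deliverable:

```python
# A hypothetical deliverable standing in for the real application under test.
class ShoppingBasket:
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

def smoke_test_basket():
    """Fast, shallow checks: does the core user journey work at all?"""
    basket = ShoppingBasket()
    basket.add("book", 12.50)
    basket.add("pen", 1.25)
    assert basket.total() == 13.75, "basic add/total journey failed"
    return "smoke tests passed"

print(smoke_test_basket())
```

Scripts like this, run on every deployment to the quality environment, are what make the short testing windows of an iteration feasible.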

This raises some challenges about defects/tests duplicates and detailed planning of who does what and when. These areas will be addressed in my next article.

However, it cannot be denied that, the “everybody tests” approach strongly improves the time to market without compromising quality. It can definitely help optimise your project while increasing agility benefits.

To read the original post and add comments, please visit the SogetiLabs blog: Everybody tests with Agile

Related Posts:

  1. Agile (testing) – the final solution?
  2. Our Top 10 2014: Agile-Testers as the Trojan Horse in Traditional Development Projects!
  3. Agile Test Improvement: contradiction or two sides of a coin?
  4. Agile-Testers as the Trojan Horse in Traditional Development Projects!

AUTHOR: Lionel Matthey
Lionel Matthey has been a Senior Consultant for the Sogeti Group since 2012. In this role, he is responsible for delivering testing and project management services to our clients. He is also in charge of research into good practices in agile testing; several knowledge-sharing sessions have already been organised to present the results of his studies and practical examples of the lessons he has learnt in Swiss banks. Lionel is also involved in preparing answers to clients’ service requests about testing (RFPs, RFIs…). Previously, he worked for other IT consulting firms as a Test Manager, Business Analyst and Project Manager. In 2007, after a few months of test automation, Lionel moved to Zurich to take over the management of testing on a large project at UBS. He later returned to the French-speaking part of Switzerland to strengthen his skills across many aspects of testing at clients such as Orange, Rolex, Givaudan and Nestlé. In 2010 Lionel accepted the position of Project Manager/Scrum Master on Nestlé’s third and largest agile project at that time.

Posted in: Business Intelligence, communication, Developers, functional testing, waterfall      


Sogeti is proud to offer web and mobile testing services through our UK-based lab, Sogeti Studio, to support your digital or omni-channel strategy.

As part of this we are committed to researching the current trends in desktop, tablet and mobile devices as well as the operating systems and browsers that run on them. This allows us to ensure we can thoroughly test across the most popular types in each category, whilst also offering the ability to test across more niche types to deliver the test coverage our customers need.

Each month we prepare a report on the current omni-channel market trends, covering:

- Desktop Browsers
- Desktop Screen Resolutions
- Desktop Operating Systems
- Mobile/Tablet Browsers
- Mobile/Tablet Operating Systems
- Mobile/Tablet Service Provider (Device Manufacturer)

We monitor these trends over time with larger, tracking reports every 3-6 months.

Find the January 2015 report here:

For more information about Sogeti Studio, please visit:

AUTHOR: Sogeti Studio
Sogeti Studio is our London-based web and mobile testing lab.

Posted in: Apps, Cloud, Digital, Digital strategy, e-Commerce, Innovation, Marketing, mobile applications, mobile testing, Mobility, Omnichannel, Reports, Research, Smart, Sogeti Studio      


Bearings were the technological silver bullet of their time, and we continue to rely upon them today. The methods by which we monitor the quality, integration and performance of bearings can show us how to do the same thing with security intelligence. They are old, ubiquitous, transparent and irreplaceable. Just like security intelligence, we rely on them 100 percent of the time and they never fail (well, almost never). They cost nearly nothing to create or obtain, but a failure in their operation, design or implementation can have very expensive consequences. Each copy of the same model must be identical so that customers are assured of quality. Their performance must be predictable, since they are exposed to some of the most hostile and dynamic environments. They come in many sizes and designs and may be integrated into solutions in many ways, but they must perform perfectly to keep us safe.

What technology is this? A bearing.

Your automobile uses thousands of them. The aircraft and engines that push you through the sky have many more. Shower valves, bikes, treadmills, tape drives and even disk drives use bearings. They are everywhere. They are seamless. We depend on them all the time.

However, when they need to be replaced, it is not cheap. The cost of replacing a bearing in your transmission can be 1,000 times the cost of the bearing itself. Coincidentally, the same claims can be made about security technologies. An unanticipated failure due to software bugs, bad algorithms, incorrect integration or poor design can cost much more than the price of the technology itself.

However, without the technology of bearings, we would not have the high-reliability items that we rely on daily, from automobiles to air conditioners.

Why is it that we have mastered this level of high reliability with the design and implementation of bearings, but we continue to struggle with security, as is evident in the breaches that seem to be on the news nightly?

The reason is that there is more to the story of bearings than just a piece of metal. The secret to success in bearings is not about what they do or that they are made of hard metal. The true magic in bearings is all about the information we collect about how they are manufactured and how they perform every second of the day.

There is a lot we can learn from this 18th-century technology. We need to take those lessons and apply them to the security intelligence challenges of the 21st century. Surprisingly, it all comes down to lessons we should already have learned.

Quality Control

At the end of World War II, William Edwards Deming presented a paper on statistical process control in Japan. Without going into great detail, he highlighted the fact that if you are going to make something of quality, you cannot inspect each copy or widget to ensure it is “good.” You cannot look at millions of bearings. You cannot visually check every syringe. You cannot check every square micron on every platter in every disk drive.

Instead, you need to look at the process that makes these things, not the things themselves. You must know that your product is designed correctly. Furthermore, and perhaps more importantly, you must understand how the product is being manufactured to ensure it is meeting design specifications.

The idea is that if the design is good and the manufacturing process is efficient and consistent, the product coming off the line will always be as good as its design and have no variations. In other words, the focus should be less on inspecting the bearings and more on inspecting the process that creates them.

Bearings are the poster child for Deming’s theories. A bearings manufacturer can produce millions of bearings each year. However, those bearings can’t all be inspected for defects. Instead, the manufacturer must inspect the process that makes the bearings by collecting information on the manufacturing process.
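Deming’s idea can be sketched numerically: rather than inspecting every bearing, sample a process measurement and watch whether it stays within statistical control limits. The figures and the 3-sigma threshold below are invented for illustration:

```python
import statistics

# Sketch of statistical process control: derive control limits from
# sampled process measurements (say, bearing diameter in mm) and flag
# when the process drifts outside them.
def control_limits(samples, sigmas=3):
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    return mean - sigmas * sd, mean + sigmas * sd

diameters = [10.01, 9.99, 10.00, 10.02, 9.98, 10.00, 10.01, 9.99]
low, high = control_limits(diameters)
new_measurement = 10.30
print("in control" if low <= new_measurement <= high else "process drifting")
# -> process drifting
```

The point is exactly Deming’s: the judgement is made about the process, from collected data, without inspecting each individual bearing.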

On the other end of the supply chain is the manufacturer of the engine that uses the bearings, but it follows a similar process. The performance of the engine must be measured to know whether the bearings are achieving their required quality. Is the engine overheating? Is fuel consumption low? Are noises and vibrations within predictable limits? If not, you know it is likely due to the design of the engine, the design’s implementation or the components (bearings) used in the design.

However, the actual bearing is never inspected when it is in the engine. Information, intelligence and analysis tell you how the bearings are performing and whether they are meeting the needs of the dynamic environments in which they reside — that is, the engines.

What Does This Have to Do With Security Intelligence?

As with the designed systems (engines) above, we never use just one security component in our security designs. The overall security of our systems, networks and infrastructure is not based on a single technology, but rather a culmination and integration of many technologies that must work together and be built with components that are intended to perform in a dynamic environment. Our system design should consider the hostile environment in which it works, and the components that are part of the design must also be designed to work in those hostile environments.

The lesson here is that the quality of the different layers of security is no different than the various layers at which bearing technology is applied. The design and implementation of the components (such as operating systems, firewalls, antivirus and IDS) must focus on eliminating unanticipated behavior (bugs), just like bearings cannot have flaws introduced during their manufacturing process.

Additionally, the integration of those components throughout an enterprise must also be pursued to enforce a desired behavior and eliminate unauthorized access and data leakage. Engines cannot have unpredictable behavior, and neither should our networks.

Bearings are designed, manufactured and integrated into engines to enforce highly predictable behavior. Security components are designed, manufactured and integrated into complex networks to provide the same highly predictable behavior.

Security System Applications

So why are bearings and engines able to do this but security systems cannot? That question can be paraphrased in the context of bearings. How do manufacturers know whether they are producing bearings with the same quality? How do engine manufacturers know whether the bearings are still performing within their engines? The answer to those questions comes down to data collection, information gathering and analysis — otherwise known as security intelligence.

Just like manufacturing and integration, security data gathering, intelligence and analytics let security analysts find the baseline normal behavior of networks. Quality control requires that you baseline your processes to achieve consistency — a necessary step before you achieve quality.

Once the analyst has a baseline of the network’s normal behaviour, anomalies such as unusual bandwidth consumption, rapid login failures, abnormal network connections and peculiar use patterns become more readily apparent.
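A minimal sketch of that baselining idea, assuming a simple standard-deviation threshold (real security analytics platforms use far richer models, and the metric and figures here are invented):

```python
import statistics

# Baseline a network metric (e.g. hourly login failures) from history,
# then surface observations that sit far outside that baseline.
def find_anomalies(history, observations, sigmas=3):
    baseline = statistics.mean(history)
    sd = statistics.stdev(history)
    return [x for x in observations if abs(x - baseline) > sigmas * sd]

hourly_login_failures = [4, 6, 5, 7, 5, 6, 4, 5, 6, 5]
print(find_anomalies(hourly_login_failures, [5, 6, 48]))  # -> [48]
```

The spike of 48 failures stands out only because the normal behaviour was measured first – which is the whole argument of this article.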

The demands on our networks are dynamic and hostile, but they should be fairly predictable. It is that predictability and consistency that is a key component in the security of our environments. As we continue to improve predictability and consistency, continue to make regular improvements to our security profile and continue to make changes based on security intelligence and analysis of data that is collected and correlated across a large set of sources, we are able to meticulously close vulnerabilities and proactively minimise the ability for future vulnerabilities to be exploited.

These security benefits are the direct result of security intelligence, analysis and response. This is exactly the process that helps bearing and engine manufacturers produce incredible metrics. History shows us that this has been successful in implementing other technologies that are always there, always transparent, always meeting our needs and always working for us.

If we can do it with technology from the 18th century, we can do it with technology from the 21st century without reinventing the wheel — or bearing, in this case.

The original post by IBM can be found here: What Do Silver Bullets, Bearings and Engines Have to Do With Security Intelligence?


AUTHOR: IBMSecurityIntelligence
Analysis and Insight for Information Security Professionals

Posted in: Big data, Security      


Testing is an essential, but time- and resource-consuming, activity in the software development process. Generating a short but effective test suite typically requires a lot of effort, allied to expert knowledge. In a model-based process, among other subtasks, test construction and test execution can be partially automated. Model-based testing is defined as follows in the ISTQB Glossary (v2.4, July 2014):

Testing based on a model of the component or system under test, e.g. reliability growth models, usage models such as operational profiles or behavioural models such as decision table or state transition diagram.

So, model-based testing is based on creating test cases derived from a behavioural model of the test object – the (test) model – which describes the expected behaviour of the test object. These test cases are then, where possible, generated automatically from the model. The challenge with this approach lies in creating a formal behavioural model in which the operation of (part of) the application is represented.

Functional requirements, for instance derived from use cases, can serve as a starting point for the creation of the model. Either the model used for developing the application or a separate test model can be used. A test modeller creates the model by hand and may utilise a tool that can read this model and generate test cases from it. Many tools use a UML (Unified Modelling Language) model; such tools can also support the process of designing the model.

The basic idea is that from a formal or semi-formal model, complete test cases (input and expected output pairs) can be generated.

There is extensive research in this field, and the technology has now matured sufficiently that a number of commercial tools and industrial applications are available to assist with developing tests. Details of some of the available open source tools can be found here.
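To make the generation step concrete, the sketch below (a deliberately tiny, hypothetical model – not the output of any particular tool) derives test cases from a state transition model. Each case pairs an input sequence with the expected final state, so the model itself acts as a simple oracle:

```python
from collections import deque

# Hypothetical state transition model of a login screen:
# (state, event) -> next state.
MODEL = {
    ("LoggedOut", "valid_login"):   "LoggedIn",
    ("LoggedOut", "invalid_login"): "Error",
    ("Error",     "retry"):         "LoggedOut",
    ("LoggedIn",  "logout"):        "LoggedOut",
}

def generate_tests(model, start):
    """Breadth-first walk producing one test case per transition
    (transition coverage): an event sequence plus the expected end state."""
    tests, covered, visited = [], set(), {start}
    queue = deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        for (src, event), dst in model.items():
            if src != state or (src, event) in covered:
                continue
            covered.add((src, event))
            tests.append({"events": path + [event], "expected_state": dst})
            if dst not in visited:
                visited.add(dst)
                queue.append((dst, path + [event]))
    return tests

for case in generate_tests(MODEL, "LoggedOut"):
    print(case["events"], "->", case["expected_state"])
```

Replaying each generated event sequence against the real application and comparing the final state with `expected_state` is the execution half of the process; commercial MBT tools add richer coverage criteria and test data generation on top of this basic idea.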

Benefits and Advantages of Model-Based Testing

Model-based testing has several advantages when compared to conventional software test production:

  • Test cost reduction – Perhaps the most important reason for using model-based testing is that it improves the quality of the software while reducing overall effort and cost. Defects are also detected earlier in the testing life cycle.
  • Easier to achieve better test coverage – Model-based testing allows engineers to achieve better test coverage, analysis and planning. Based on well-defined models, system requirement test coverage can be measured easily, and tools can support scenario analysis and prioritisation.
  • Good documentation – Model-based testing requires the production of detailed documents, which can be used as a training tool for new testers in a team.
  • Reusability of assets – Model-based test assets can be reused and reproduced easily in regression testing. Whenever software changes are introduced, the model(s) can be modified easily to accommodate them.
  • Better understanding – Creating models increases overall understanding of the system. Inaccuracies, omissions, inconsistencies or ambiguity in the specifications and requirement documentation may cause problems or additional costs during testing. The creation of a model by testers serves to highlight such issues earlier in the test process and to detect bugs at an early stage of development.

Issues with Model-Based Software Testing

However, there are some issues relating to model-based software testing:

  • Model quality and standardisation – For model-based testing to achieve its full potential, care needs to be taken when developing the models. Test engineers need domain experience in order to create appropriate models effectively.
  • Choosing the right model – Some situations fit one model better than another, and care is required when selecting which model to use. In particular, some models require considerable effort to develop, depending upon the level of detail required and the complexity of the domain.
  • Model complexity – In large projects, trying to model an entire system using finite state machines becomes infeasible, given the inherent complexity of the problem at hand. State explosion refers to the huge number of states generated when trying to model a real system at a fairly detailed level.
  • The oracle problem – Automated mechanisms that evaluate test results (called oracles) are crucial to model-based testing; without them, a major value of model-based testing, automation, is almost voided. Model-based testing techniques do not, by themselves, solve the oracle problem; in fact, automation reportedly makes it even more difficult.
  • Skill requirements – Model-based testing demands a high level of skill from test engineers. They not only need domain-specific knowledge, but should also be familiar with the theory behind the models, and need experience and/or training in the use of the various automated tools available. Furthermore, quality modelling and analysis depend on engineers having a good understanding of the products, the experience to write the model and the judgement to choose the test selection criteria.
  • Coverage of quality attributes – Model-based testing cannot be used to test all of the quality attributes required for a complete test suite; MBT covers only functionality and some aspects of suitability. Other techniques will be required to cover the remaining aspects of the test project adequately.


Model-based testing is certainly worth considering for small to medium-sized projects, especially where repetition of testing is a considerable factor (e.g. regression testing).

Sogeti includes MBT in its testing solutions

Sogeti provides clients with high quality, cost effective testing solutions, using MBT, automation and other methods.

At the end of 2012, we launched PointZERO – a framework that delivers parallel step-by-step improvement based on an array of measures, methods and tools, leading to business solutions that are fit for purpose and right the first time.

In 2014, we launched the market-leading Sogeti Studio, which provides our clients with an on-demand service enabling them to create websites and mobile applications, and to test them across a wide range of regularly refreshed browsers, devices and application configurations. Within the Studio we also carry out a number of tool evaluations, and our consultants are skilled in a range of testing methods.

Sogeti continue to make significant investment to further develop our reputation as a global testing services thought leader, and we are currently developing model-based testing capability to extend our service offerings.

AUTHOR: Michael Lagdon
Michael Lagdon is a Managing Consultant at Sogeti UK.

Posted in: Automation Testing, Behaviour Driven Development, functional testing, Sogeti Studio, Transitioning      


What is Omni-Channel?

Omni-channel retailing is an evolution of multi-channel retailing, concentrated more on providing a seamless consumer experience across all available shopping channels, i.e. mobile internet devices, computers, brick-and-mortar stores, television, radio, direct mail, catalogues etc. Customers now expect this type of experience, and retailers have to meet these demands by deploying specialised supply chain strategy software, updating their testing strategies, and tracking customers across all channels.

The Evolution of Omni-Channel

Way back in time (or so it feels), when companies created their first websites, they created their first digital relationships with their customers. Businesses developed a voracious appetite for a market presence that was no longer constrained by bricks and mortar and limited opening hours.

Just a few years later, Yahoo and Google enabled a second digital layer – search – and a multi-billion dollar search industry was born; companies realised they needed their websites to be ‘found’.

Around 2007, ‘chat rooms’ evolved into social networks, quickly becoming a preferred way for consumers to communicate and discover new products and services – whether reviews were good or bad. There was a social network explosion that brought with it a host of new advertising, commerce and service options.

At the same time, mobile began to pick up steam and so the greatest new commerce channel since shopping centres was born. Combined with social media, mobile presented an unprecedented opportunity to provide the consumer, whenever and wherever they were, with relevant and helpful content. Fast forward a few years and consumers have quickly come to expect a 24/7 buying and service experience.

The Challenges

Many retailers have spent their entire existence thinking about how to build an engaging experience in one channel, usually a store. Now, with web and mobile almost a commodity, retailers have to understand how to connect with their customers in the way those customers want to connect. Increasingly complex layers of digital connection between companies and their customers are being created. Understanding and mastering these layers, and integrating traditional advertising through a uniform and consistent communication strategy, represent new challenges for marketing and retailing. Every major retailer now needs an omni-channel marketing strategy to survive. Those retailers that keep pace with the latest mobile technologies, and evolve their marketing strategies to embrace them soonest, will be the only ones to maintain and grow their market share.

In order to achieve success in omni-channel, retailers and marketers must look through the eyes of the consumer, optimising the experience across all channels so that it is seamless, integrated and consistent. Omni-channel anticipates that customers may start in one channel and move to another as they progress to a conclusion, i.e. a sale. For example, they may see an advert on television, go online to research products, use their tablet to start a purchase or add to a basket, then use a smartphone to complete the purchase.

Standing out from the crowd across all of these channels, in this increasingly complex and cluttered world, is not easy. If the customer experience in each channel is not uniquely helpful, customers will become annoyed by repetitive ubiquity. There needs to be a different yet consistent brand approach for each channel.

In addition, marketing is undergoing rapid and major changes. It is shifting away from mass ‘push’-based marketing towards more personalised communication with consumers, through the many channels and on the many devices they use.

Consumers leave a data trail on every level. The companies that can best mine this stream of information will gain a powerful competitive advantage. Analytics is the new key to the kingdom, and for many retailers, good data analysis may be the only competitive advantage in a world of instant price comparisons, multiple review sources, and ‘buy it now’ buttons. This data can be used to tailor the consumer experience across all channels. This will drive increased customer loyalty, and with that come increased sales!
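As a minimal sketch of the kind of cross-channel mining described above (the event records, field names and identifiers are invented for illustration), the snippet below stitches each customer's events from several channels into one chronologically ordered journey:

```python
from collections import defaultdict

# Hypothetical event stream from several channels:
# (customer_id, ISO timestamp, channel, action).
events = [
    ("c42", "2014-11-01T19:05", "tv",     "saw_advert"),
    ("c42", "2014-11-01T20:10", "web",    "researched_product"),
    ("c42", "2014-11-02T08:30", "tablet", "added_to_basket"),
    ("c42", "2014-11-02T12:45", "mobile", "completed_purchase"),
    ("c99", "2014-11-01T09:00", "store",  "browsed"),
]

def customer_journeys(events):
    """Group events by customer and order each journey chronologically,
    giving a single cross-channel view per customer."""
    journeys = defaultdict(list)
    for customer, ts, channel, action in events:
        journeys[customer].append((ts, channel, action))
    # ISO-formatted timestamps sort correctly as strings.
    return {c: sorted(evts) for c, evts in journeys.items()}

for ts, channel, action in customer_journeys(events)["c42"]:
    print(ts, channel, action)
```

Joining the channels is what makes the television-to-mobile purchase path in the earlier example visible at all; real implementations face the much harder problem of matching customer identities across channels before any such grouping can happen.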

As if developing effective omni-channel marketing solutions wasn’t tough enough, keeping pace with the underlying technologies increases the complexity twofold. Companies need their omni-channel marketing solutions to provide a high quality experience across channels, which means testing across a large number of devices, operating systems etc. In addition, the new technology must integrate fully with their legacy and back office systems.

Marketers and retailers not only need to test email subject lines, social channels, content, landing pages and so on; testing the customer experience and technology aspects before going to market also requires an ever-evolving set of technical testing skills and resources, and few companies can justify the investment required to provide these in house.

How Sogeti Can Help

Sogeti is perfectly placed to help companies deliver omni-channel marketing with a fully integrated testing solution.

Not only can we provide a core front-end testing capability for Mobile Functional Testing, Mobile Usability Testing, Mobile Performance Testing, Mobile Security Testing and Web Accessibility Testing, we can support end-to-end omni-channel marketing scenarios with user experience research, plus mobile payments using the latest NFC (Near Field Communication) and HCE (Host Card Emulation) technology.

Sogeti Studio – our UK-based web and mobile testing lab – provides services on demand, and gives you access to all of the devices and resources you need to test thoroughly, without significant investment. Out of the Studio we also offer Content Management System (CMS) testing, tool evaluations and more. Find details and case studies here.

Sogeti can also support mobile strategy, development and portfolio management, as well as all of your Big Data and analytics and Smart Commerce needs. Mobile details here. Big Data and Analytics here. Smart Commerce here.

To find out more, please email us, or call us today: +44 (0) 20 7014 8900


AUTHOR: Michael Lagdon
Michael Lagdon is a Managing Consultant at Sogeti UK.

Posted in: Business Intelligence, communication, Digital, Digital strategy, e-Commerce, Marketing, mobile applications, mobile testing, Mobility, Omnichannel, Rapid Application Development, REWARD, Security, Social media, Social media analytics, Software Development, Software testing, Sogeti Studio, User Experience      


In my previous blog post, Wanted (or not): new design principles, I referred to the discussions I’ve been engaged in recently on whether or not architects need new design principles. In the context of such new design principles, I suggested that we need more and smaller building blocks.

Last month, I had a highly inspiring conversation with a fellow researcher at the Utrecht University of Applied Sciences. He had just returned from a study trip during which he had met with executives from big Silicon Valley players such as Amazon, Netflix and Facebook. These companies are completely organised around business services. Each business service is owned by a small multi-disciplinary team and is updated continuously – we are talking of hundreds of service updates each day! Flexibility is achieved because business services are built from other, very small services. Services are published, and you can find the one that best fits your needs by searching for relevant data. Each team very much owns the technology it wants to use; standardisation is restricted to communication between services. To me, this is how services should be.
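As a toy illustration of that "standardise only the communication" idea (the service names and message format are invented, and real implementations would of course run as independently deployed processes), here are two tiny services that agree on nothing but a JSON message contract, with one business service composed from the other:

```python
import json

# Each "service" is free to use whatever it likes internally; the only
# shared standard is the JSON message passed between them.

def pricing_service(request_json: str) -> str:
    """Very small service owned by one team: quotes a price for a product."""
    req = json.loads(request_json)
    prices = {"book": 12.50, "pen": 1.20}  # internal detail of this team
    return json.dumps({"product": req["product"], "price": prices[req["product"]]})

def order_service(request_json: str) -> str:
    """Business service composed from smaller services via the JSON contract."""
    req = json.loads(request_json)
    quote = json.loads(pricing_service(json.dumps({"product": req["product"]})))
    total = quote["price"] * req["quantity"]
    return json.dumps({"product": req["product"],
                       "quantity": req["quantity"],
                       "total": total})

print(order_service(json.dumps({"product": "book", "quantity": 2})))
```

Because the contract, not the implementation, is the standard, either team can rewrite its service internals, or swap technology stacks, without touching the other: exactly the flexibility the Silicon Valley model relies on.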

An additional advantage of adopting the idea of business services is that it stimulates designing from the customer perspective. In one of my earlier blogs, I discussed how the customer perspective is becoming increasingly important for architects, yet the focus of many architects remains mostly on the efficiency of internal processes. If architects shift their focus to business services as a starting point for architecture, the customer may finally come into the picture.

All of this fits the shift (please refer to the table below) that we identified in one of our recent architecture discussion sessions:

Focus is on … | Instead of …
Designing services for use (to achieve flexibility and speed) | Reuse
Points of contact | Shielding
Realizing value for the customer | Internal efficiency
Architecting trust into the interaction | Transaction, production and delivery, which are separated
Digital value (use) | Digital asset (possession)


To read the original post and add comments, please visit the SogetiLabs blog: The service is dead, long live the service!

Related Posts:

  1. Projects are dead!
  2. At your service…
  3. Testing is dead, long live the Tester.
  4. Wanted (or not): new design principles

AUTHOR: Marlies van Steenbergen
Marlies van Steenbergen started her career with Sogeti Netherlands in the role of service manager enterprise architecture in 2000. After working as a consultant for a few years, she became Principal Consultant Enterprise Architecture in 2004. In this role, she is responsible for stimulating and guaranteeing the development of the architectural competence of Sogeti Netherlands. Since 2012 she has been the main proponent of enterprise architecture and DYA within Sogeti Netherlands.

Posted in: architecture, communication, Innovation, Opinion      


The Financial Crimes Enforcement Network (FinCEN) has proposed changes to the Bank Secrecy Act (BSA) in order to clamp down on money laundering at financial institutions; the changes have been debated for some time across numerous departments. According to the National Law Review, the changes affect customer due diligence (CDD) requirements for certain covered financial institutions, including mutual funds, brokers or dealers in securities, futures commission merchants and introducing brokers in commodities. Comments and feedback on these proposed changes are due by Oct. 3, 2014.

Details of the FinCEN Proposal

The proposed changes to the BSA add more requirements to anti-money-laundering (AML) programs and customer identification programs (CIP) in the form of CDD requirements. These CDD requirements would apply to all covered financial institutions under the USA PATRIOT Act. They would require customers to document beneficial ownership for their legal entity accounts (e.g., mutual funds and brokerage accounts) and would codify the requirements.

FinCEN’s proposed rules to revise the current AML requirements for CDD address the following:

  • Identify and authenticate a customer’s identity, which is already a requirement of the existing CIP rules.
  • Identify, authenticate and understand the beneficial owners of a legal entity (i.e., an association, partnership, proprietorship, corporation or trust). The rule states that a beneficial owner is anyone who holds a 25 percent or more equity interest in the entity or who has significant management responsibilities within it.
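To make the two-pronged ownership test concrete, here is a small sketch (the data structures and names are invented, and the actual regulatory definition carries more nuance, e.g. indirect ownership) of how the 25 percent equity / significant management rule might be expressed:

```python
EQUITY_THRESHOLD = 0.25  # 25 percent or more equity interest

def beneficial_owners(equity, managers):
    """Return everyone meeting either prong of the proposed test:
    equity of 25% or more, or significant management responsibility."""
    by_equity = {person for person, share in equity.items()
                 if share >= EQUITY_THRESHOLD}
    return by_equity | set(managers)

# Hypothetical legal entity: equity shares sum to 1.0.
equity = {"Alice": 0.40, "Bob": 0.30, "Carol": 0.20, "Dan": 0.10}
managers = {"Carol"}  # significant management responsibility

print(sorted(beneficial_owners(equity, managers)))
# Alice and Bob qualify by equity; Carol by management; Dan by neither.
```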

FinCEN is also proposing to add a fifth pillar to AML compliance. This pillar would address CDD and would require covered financial institutions to understand the nature and purpose of their customer relationships and to implement ongoing monitoring.

Currently, the pillars are:

  1. Designation of a compliance officer.
  2. Development of internal policies, procedures and controls.
  3. Ongoing and relevant training of employees.
  4. Independent testing and review.

Analyst Comments

These proposed CDD requirements have been a widely discussed topic for both U.S. and international law enforcement and regulatory agencies for quite some time. Fraudsters, criminal organisations and terrorists are known to abuse legal entities to their advantage. The ability to identify the individuals who own these legal entities and do business within the U.S. financial system will greatly assist in reducing this type of abuse.

FinCEN’s first publication regarding the proposed CDD requirements was released in March 2012 and set the stage for codifying and enhancing these CDD requirements. The current proposal is partly a product of the 2012 regulatory process and collaboration with other interested regulatory agencies (the Office of the Comptroller of the Currency, Federal Reserve Board, Federal Deposit Insurance Corporation, Securities and Exchange Commission, and Commodity Futures Trading Commission).

If approved, this proposal would require the identification of beneficial owners of legal entity customers and would add this CDD component as a fifth pillar of BSA/AML programs.


AUTHOR: IBMSecurityIntelligence
Analysis and Insight for Information Security Professionals

Posted in: Security      


The webinar on “The Experience of Things” will be held on Monday, February 16 from 5:00 – 6:00 PM CET (11:00 AM-12:00 PM EST, 9:30-10:30 PM IST). The speaker for the session is Joo Serk Lee.

Block your calendars now for an insightful session!

About the topic

From green screens to the Internet to smartphones and wearables, designing how people interact with technology has never been as important as it is today…or as challenging.  During the session, Joo will take us through a brief history of our interaction with computing devices, emerging themes in the balance between design and technology, and why maintaining this balance will continue to stretch practitioners of both in new directions in the age of the digital enterprise.

Instructions on how to join the webinar

  1. Click on the following link: (from your PC, Mac, iPad or Android tablet)
  2. Register with your name, location and email address
  3. Click on “use my computer” to see the content we will share
  4. If requested, allow the system to use your webcam/microphone
  5. Only presenters are able to speak at first. If you want to speak, click on the unmute button next to your name in the attendees list. You can also write live comments in the chat box.

The webinar will be recorded and made available later.

To read the original post and add comments, please visit the SogetiLabs blog: Coming up: Webinar on ‘The Experience of Things’

Related Posts:

  1. Coming up: Webinar on ‘The Recover Approach’
  2. Webinar: Uniting Robotics and IT Testing
  3. Experience Your Magic Moments with Big Data
  4. DotCom 2.0? The Perils of Our Focus on User Experience


AUTHOR: Joo Serk Lee
Joo Serk Lee is a Vice President at Sogeti USA who has spent the past two years architecting major programs at Aspen Marketing Services and at Abbott Laboratories. He is an Enterprise Architect by trade and has spent much of his 15-year career working in and crafting transformation programs featuring a wide range of complex technologies, including Microsoft/Java stacks, mobility, and large CRM and ERP solutions.

Posted in: wearable technology, Webinars