The requirements of MiFID II and MiFIR (the revised Markets in Financial Instruments Directive and Regulation) will result in big changes to the way that trading firms, on both the buy and sell side, are required to test their trading algorithms. The repercussions for a breach are severe: financial penalties for firms of €15 million or 15% of turnover, senior management held explicitly responsible to the tune of up to €5 million where a lack of testing causes or contributes to market disorder, and up to four years’ imprisonment for an intentional breach! So, with the deadline of 3rd January 2017 not far away, what are the testing requirements and challenges, and how can trading firms ensure that they are compliant in time?

Algorithm Testing

The main requirements that affect algorithm testing are:

  • Testing that algorithms remain stable under stressed market conditions
  • Ensuring that individual algorithms do not cause or contribute to a disorderly market when interacting or under pressure from other antagonistic algorithms
  • Implementing system and risk controls to ensure sufficient capacity and resilience, and applying thresholds and limits that prevent spurious orders.
  • Ensuring algorithms do not behave in a way that is contrary to the relevant Trading Venue rules as well as the legislation.
  • Implementing a business continuity strategy in the event of system failure to minimise and prevent disruption.
  • Trading Venues must provide adequate testing facilities to ensure compliance.
  • Identifying different algorithms and their uses and the individuals using them.
  • Certifying and showing evidence that the algorithms have been tested, are stable and are unlikely to cause disorderly markets and flash crashes.
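The threshold and limit controls in the list above can be illustrated with a minimal pre-trade check. This is a sketch only: the limit values, the `Order` fields and the control names are illustrative assumptions, not figures or terminology prescribed by MiFID II.

```python
from dataclasses import dataclass

# Illustrative limits -- assumptions for this sketch, not MiFID II figures.
MAX_ORDER_QTY = 10_000        # block fat-finger quantities
MAX_PRICE_DEVIATION = 0.10    # block prices >10% away from the last trade
MAX_ORDERS_PER_SECOND = 100   # simple message-rate throttle

@dataclass
class Order:
    symbol: str
    qty: int
    price: float

def pre_trade_check(order, last_price, orders_this_second):
    """Return the list of breached controls; an empty list means the order may pass."""
    breaches = []
    if order.qty > MAX_ORDER_QTY:
        breaches.append("quantity limit")
    if abs(order.price - last_price) / last_price > MAX_PRICE_DEVIATION:
        breaches.append("price collar")
    if orders_this_second >= MAX_ORDERS_PER_SECOND:
        breaches.append("rate limit")
    return breaches
```

In practice, controls of this shape sit in front of the exchange gateway, and the same checks give testers concrete pass/fail assertions to cite as certification evidence.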


The main barrier to compliance is that this requires a completely new approach: it is extremely challenging to reproduce real market conditions in a non-live, controlled test environment in order to test an algorithm under both normal and stressed conditions. It requires a shift-left approach of testing prior to deployment, as well as continuous testing as new algorithms are deployed and integrated. Trading Venues also now need to invest in creating high-functioning IT organisations with a cutting-edge IT security and security testing strategy that also encompasses 3rd-party providers. Building the kind of test environment needed to meet the MiFID II requirements is expensive and complex and, as it requires specialist QA and testing expertise, the gap in the skills market in this sector will also need to be addressed fast. Data integrity is the key to successful testing, and it’s particularly difficult to reproduce accurate data due to the constant dynamic ebb and flow of the markets.

Test Strategy

  • Select a partner who specialises in testing in the financial sector and who can create a comprehensive managed test service that emulates live market conditions with a dynamic response, in a cost-effective way.
  • From the outset, agree market disorder pass/fail levels in alignment with MiFID II and the rules of the individual Trading Venue.
  • Ensure that there is an agreed mechanism for trade desks to share policy changes, to keep testing ‘real time’ with respect to policy and MiFID compliance.
  • Put test tooling at the front of the strategy to make sure that the tool can scale, can capture high transaction volumes and can be used in a rapid deployment scenario.
  • Source all data directly from the exchanges to ensure its integrity.
  • Carry out Performance, Functionality and Stress Testing.
  • Create as many different scenario-based tests as possible, testing the impact of antagonistic algorithms.
  • Create a comprehensive reporting and analytics system that is fully aligned to MiFID II and the individual Trading Venue rules.
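A scenario-based stress test from the list above can be as simple as replaying a synthetic “flash crash” price path through an algorithm and asserting agreed pass/fail thresholds. The toy momentum algorithm and the threshold values here are invented for illustration; they stand in for the firm’s real algorithm and the disorder limits agreed with the Trading Venue.

```python
# Replay a stressed price path through an algorithm and fail the scenario
# if agreed disorder thresholds are breached. Algorithm and limits are toys.

def momentum_algo(prices):
    """Toy algorithm under test: buy 100 on an up-tick, sell 100 on a down-tick."""
    position, orders = 0, 0
    for prev, curr in zip(prices, prices[1:]):
        if curr > prev:
            position, orders = position + 100, orders + 1
        elif curr < prev:
            position, orders = position - 100, orders + 1
    return position, orders

def run_scenario(prices, max_position=1_000, max_orders=500):
    position, orders = momentum_algo(prices)
    return {
        "position_ok": abs(position) <= max_position,
        "order_rate_ok": orders <= max_orders,
    }

# A "flash crash" style path: a steady rise followed by a sharp collapse.
stressed = [100 + i for i in range(20)] + [120 - 5 * i for i in range(20)]
result = run_scenario(stressed)
```

A real harness would replay exchange-sourced tick data and score disorder metrics agreed with the Trading Venue, but the shape of the test stays the same: scenario in, threshold verdict out.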

Collaboration is Key

At the FIX Trading Community conference, several industry insiders quite rightly said that collaboration is the key to successfully navigating these arduous requirements. The debate threw up possible solutions such as open access to clearing, trading and central securities depositories to minimise costs, and the creation of a cross-market utility owned by both the buy and sell sides to produce the required quality of post-trade data. It’s not only the Trading Venues who are feeling overwhelmed by MiFID: as Edwin Schooling Latter, head of markets infrastructure and policy at UK regulator the FCA, told Elliott Holley last year, “We at the FCA have to monitor new transaction reporting flows, have a mechanism to monitor position limits, do transparency monitoring – so it’s not just market participants who will have to meet these new requirements. We will, too.”

For a more in-depth discussion around your testing requirements in this area, please connect with me or send me an email.

Daryl is a Delivery Director at Sogeti UK, primarily working with financial services clients. He is both a mobile and an agile subject matter expert.

Posted in: Business Intelligence, functional testing, Human Interaction Testing, Infrastructure, IT Security, IT strategy, Managed Testing, Quality Assurance, Requirements, Research, test data management, Test environment, test framework, Test Tools      


Bimodal IT

In 2014, Gartner coined the phrase “Bimodal IT” to describe the need for IT organisations to adopt the dual functions of “Rock Solid” traditional IT and “Fluid Digital” IT for continuous innovation, working at two different speeds to seize the opportunities of Digital Transformation while maintaining day-to-day operations. Whilst not entirely new, this concept is currently creating a buzz in the IT industry, so we need to examine the changes it will bring about in the context of QA and Testing.

The emphasis for Type 1 Traditional IT is firmly on going slow and getting things right, being risk averse and ensuring efficiency, security and safety. Type 2, Fluid IT, is dependent on using Agile methods in a DevOps environment with the emphasis on innovation and rapid delivery, while being closely aligned to the needs of the business and the end user. This may go so far as to allow the end user to take the initiative on how technology should evolve.

Agile Development and DevOps Principles

Gartner fellow and VP Daryl Plummer’s research showed that 45% of CIOs already operate a branch of Fluid IT, and Plummer predicts that by 2017, 75% of IT organisations will have a fully fledged Bimodal capability. Given that 54% of this year’s World Quality Report respondents have already adopted Agile development, 67% of all respondents are using DevOps principles, and most organisations are also still running waterfall projects in more traditional environments, it seems that businesses are already naturally working in two modes. However, to become truly Bimodal in the way that Gartner means requires a change in thinking: a conscious decision to completely embrace these dual ways of working and to see them as a long-term collaborative balancing act that enables the totality of the business’ needs, rather than a transitional phase.

Collaboration or Conflict?

The concern most widely voiced is that the two functions will be in conflict, jostling for position and resources, and that Traditional IT will feel as if it is being edged out. In fact, the Bimodal model relies on the opposite: both functions realising that they are essential to the success of the business, with Traditional IT taking on an even more important role than in the current focus on “transition” to Agile and DevOps. Traditional IT could actually become less stagnant because, rather than potentially becoming obsolete, it will remain a critical function. This means it has to adapt to the fact that, while the Fluid IT function is innovating so fast, what was traditional yesterday will be old news tomorrow; its own function will also be changing, albeit much more slowly. In a LinkedIn piece on Bimodal operations last year, James Wilson, founder at Aria Technologies, suggested that “The ideal solution would be to enable software and system changes – both prescribed and Agile – to converge into a single process; I call this the ‘Bi-Modal Convergent Development Life-cycle’, or ‘Bivergent’.”

Testing in an Ultra-DevOps Environment

So where does testing fit into this ultra-DevOps environment? Agile and DevOps have already built bridges to previously siloed disciplines like QA and Testing, creating a Dev-QA way of working. We will continue to see a rise in the trend for testers to be more customer-focused, working with development and business analysis teams, with an emphasis on shifting left to enable Test Driven Development. The World Quality Report predicts that there will also be a “shift right”: the integration of development and operations now incorporates continuous quality aspects across these disciplines for continuous deployment, and with it the need for continuous testing. Currently, 50% of WQR respondents already deploy continuous testing techniques, 29% plan to use them imminently and only 21% state that it is not in their immediate strategy plans. DevOps will need to mature faster than its present rate, as 51% of organisations currently feel that their DevOps methodology isn’t sufficiently well defined.

To support the Bimodal model, Testing and QA teams need to avoid becoming a bottleneck and instead use a combination of traditional testing methods and continuous testing, virtualisation, predictive analysis, continuous feedback and high levels of automation to become the enabler of this new level of ultra-DevOps rapid deployment.
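As a sketch of what “continuous testing as an enabler rather than a bottleneck” can mean in practice, the following shows a build-promotion decision driven entirely by automated results. The stage names and coverage threshold are illustrative assumptions, not a specific tool’s configuration.

```python
# A build is promoted only when every automated stage passes and coverage
# meets the agreed bar -- the gate keeps manual QA off the deployment
# critical path while preserving a quality check at every release.

def quality_gate(stage_results, coverage, min_coverage=0.8):
    """Return True when the build may be promoted to the next environment."""
    return all(stage_results.values()) and coverage >= min_coverage

pipeline = {"unit": True, "integration": True, "performance": True}
promote = quality_gate(pipeline, coverage=0.85)
```

The design point is that the verdict is computed, not negotiated: rapid Fluid IT deployments and risk-averse Traditional IT can share the same automated gate.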


Posted in: Automation Testing, DevOps, Digital strategy, Infrastructure, Innovation, IT strategy, Quality Assurance, Research, Security, Software Development, Technology Outlook, User Interface, Virtualisation, waterfall, World Quality Report      


Throughout industry, whether business-led or IT-led, the mantra still seems to be that if you are not ‘agile’, you can’t survive, or rather succeed, in today’s marketplace. But what does ‘agile’ mean? In my role, I speak to numerous companies, whether in a business development capacity, networking at industry events or after presenting at conferences. The views I hear are polar opposites, sometimes disingenuous or contradictory to the manifesto’s guidelines. Some of the questions and statements that I hear on a regular basis include:

  • “Do you develop products or technology in an agile way?”
  • “Does your company adopt business agility models?”
  • “Are you a Scrum fan or Kanban? “
  • “Actually we operate lean principles “
  • “We’re not quite there yet, so we are more scrumfall than anything”
  • “It sounds good, but it’s just so difficult to be heard”
  • “We have a daily stand up every day”
  • “The business/IT Management just don’t get it”
  • “The pressure of multiple releases is just intense”
  • And my personal favourite “We’re so agile, we go beyond agile”

The objective of agile is no different from that of more traditional approaches, i.e. a working product (Quality), delivered on or under budget (Cost), in an acceptable timeframe (Time), to gain advantage over the competition by increasing company reputation and trust as the ideal solution provider.

There is a lot of talk in blogs about how ‘agile’ must adapt to keep up with the ever-changing demands of the marketplace. I’m not sure that I completely agree: as the examples above show, there is already confusion in the marketplace, so change for the sake of change may make the gulf in ‘maturity’ even wider. A good example of change is the kudos that DevOps brings. If you ask 10 different people to describe agile, you’ll get many different views. If you ask the same people to describe DevOps, you’ll likely receive more than 10 views, as was proven in my office when we discussed what our testing offering could bring to a DevOps environment. Everyone had a view about what it was, though the biggest consensus gained was on how to spell it!

For me, that is the bottom line: if your project, company or industry wants to operate in an agile way, or adopt a certain flavour of agile, you need to lay the foundations; in effect, define your individual manifesto and publicise it. For example:

“Working software over comprehensive documentation” could translate to “working software with just enough documentation to meet our regulatory demands”.

I could go on, but I think the message here is clear: to be agile, you need to collaborate. To collaborate, you need to know what you are working towards. Define a top-to-bottom understanding within your organisation or project team to enable change, transform your approach and deliver success, and if you want to badge this as ‘agile’, that’s fine too.

Working for a global consultancy like Sogeti UK, where the majority of our business is in the testing space, when we are asked to assist with introducing agile testing to engagements, we look at the bigger picture. Understanding the landscape of the project and the company determines the type of consultants that we deploy to the engagement. If a client still needs to set their own manifesto, there isn’t much reason to deploy highly technical testers: reputation could suffer if collaboration, or rather the mechanics of the project, is not smooth, and delays occur or release dates are missed because the end game wasn’t known. Instead, we’d look to utilise more rounded consultants who can embed into teams and coach and steer the team or company towards defining “what does good look like for us?”, setting the manifesto and expectations for all to understand, commit and collaborate to.

At Sogeti, we look to help companies achieve their own manifesto through workshops involving senior stakeholders through to junior testers, with the aim of aligning objectives, and by embedding consultants to assist companies with their transformation journey: defining an individual working ‘agile’ approach, irrespective of the ‘flavour’ applied. At every step of the way, we look to support their needs as maturity, i.e. success, is gained, to move towards the seeming utopia of DevOps (I think that was the agreed spelling!).

If you’d like to find out more about the services Sogeti UK can offer, visit our website or drop me a line.


Posted in: Agile, Behaviour Driven Development, Business Intelligence, communication, Developers, DevOps, IT strategy, Scrum      


Having spent two days at London’s ExCeL conference centre at AppsWorld, it was great to see the level of advancement and constant change that is prevalent across various industries. Technology is playing an ever-increasing role in industry, and the panel discussions that pulled contributors from different areas were a great insight into how similarly the challenge of keeping up with the effects of consumerisation is affecting IT programmes.

The conference focussed on open content areas, such as:
– Developer World
– Droid World
– Cloud World
– Connected Car
– Gaming World
– Enterprise World

In addition, there were a series of premium tracks that covered a wide array of topics:
– Mobile Payments & Retail
– Mobile Strategy & Marketing
– TV & Multi-Screen Apps
– API Strategies
– Wearable App Tech
– HTML5 Workshops

The use of APIs

As I mentioned, the panel discussions were very beneficial. On Day One, the discussion around “Exploring the business value of APIs – Opening data as a channel for Innovation” was very thought-provoking. With speakers from councils, retailers and product companies, there was a very balanced feel to proceedings. The main takeaway from the session centred on the consistency and availability of message transport. Oliver Ogg (Product Owner of APIs for M&S) focussed on how the company are not only providing digital solutions for their customers, but also providing solutions for their staff to use in-store, to ensure there is a ‘single source of proof’ for customer enquiries. The digital, omni-channel experience, though focussed on the consumer, needs to consider how staff interact with consumers. How the in-store message is translated to the consumer on their mobile device, or at home on their desktop, is key to converting enquiries into product sales.

API conversation, specifically publicly available APIs, was present across multiple tracks over the two days. Companies are making APIs available in the public domain to encourage innovation in the market. Providing the tools (or guidelines) for developers to be creative in designing new or better ways of completing transactions is actively being encouraged. This was epitomised for me during the open track session presented by Mark Dearnley, HMRC’s Chief Digital and Information Officer. During his session, Mark provided an outline of the Government Digital Strategy and how, over the course of 2015, HMRC’s APIs will be made publicly available for developers to ‘make things easy’. HMRC have no desire to control the market, preferring to adopt a natural selection process.

What this means is that if we take the example of Self-Assessment (SA), over the course of 2015 there may be a number of privately developed apps, across mobile platforms, that attempt to make SA submission more efficient. Through consumerisation, only the best apps will survive: natural selection will ensue as app store ratings take effect. Only the most user-friendly apps will last, reducing the need to control the marketplace. As the APIs are in the public domain, HMRC can control the integration.
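To make the open-API model concrete, here is a hedged sketch of how a third-party Self-Assessment app might consume such an API. The endpoint URL, query parameters and JSON shape are entirely invented for illustration; they do not describe HMRC’s real API contract.

```python
import json
from urllib.parse import urlencode

# Hypothetical example only: the base URL, parameters and response schema
# below are invented to illustrate consuming an open API, not HMRC's API.
BASE_URL = "https://api.example-tax-service.gov.uk/self-assessment"

def build_request_url(utr, tax_year):
    """Compose the query URL a client app would call."""
    return f"{BASE_URL}/liability?{urlencode({'utr': utr, 'taxYear': tax_year})}"

def parse_liability(response_body):
    """Extract the amount owed from a (canned) JSON response."""
    return float(json.loads(response_body)["liability"]["amountDue"])

# Canned response standing in for a live call during testing.
sample = '{"liability": {"amountDue": "1234.56", "dueDate": "2016-01-31"}}'
```

Because the integration surface is just URLs in and JSON out, app developers can test against canned responses like `sample` long before touching the live service, which is part of what makes the natural-selection model workable.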

CRM Strategies, Push Notifications and App Usage

The session hosted by Patrick Mareuil, Chief Innovation Officer of Accengage, provided a very good overview of consumers’ brand loyalty with respect to app usage. Some of the statistics shared make interesting reading:
– 20% of users access mobile apps only once
– 40% of users access the app between 1 and 3 times
– 40% access apps 11 times or more

These statistics are interesting: taking them at face value, we can see a lot of missed potential in terms of consumer engagement.

From a testing perspective, the overview of the way push notifications are used outlined a number of use case scenarios where companies such as Sogeti can assist with productionising apps ready for general use.
Concentrating on the right message, at the right time, in the right place, on the right channel is an important part of maximising the conversion of a message. Being able to replicate these scenarios in testing will provide customers with the right data to complement their digital marketing strategy. As with all approaches, there is a fine line between optimising the interaction with customers and overkill: too much interaction and prompting of the user can have a negative effect on a consumer’s willingness to buy.

In addition, the way in which this marketing activity interacts with other applications on the mobile device should be considered. If a user is playing level 105 of Candy Crush and, at the key moment of completing the level, a push notification interrupts their enjoyment, this could again cause negative feedback. Balancing the need to interact and promote offers without interfering with the consumer’s day-to-day use of their smartphone will need to be covered by the test scenarios that constitute the scope of a release. Throw into the equation the different approaches to notifications across device platforms, and the scope of testing increases exponentially to ensure maximum consistency of message across platforms and keep the user experience standard.
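Scenario testing of “right message, right time” rules is easiest when the rules are expressed as a decision function with explicit inputs, which can then be covered exhaustively. The quiet-hours window, daily cap and foreground-app rule below are assumptions for the sketch, not any vendor’s actual policy.

```python
from datetime import datetime

def should_notify(now, quiet_start, quiet_end, foreground_app_active,
                  sent_today, daily_cap=3):
    """Decide whether a push notification may fire.

    quiet_start/quiet_end are hours (0-23) of a do-not-disturb window
    that spans midnight, e.g. 22 -> 7.
    """
    in_quiet_hours = now.hour >= quiet_start or now.hour < quiet_end
    return (not in_quiet_hours
            and not foreground_app_active   # don't interrupt level 105 of Candy Crush
            and sent_today < daily_cap)     # avoid notification overkill
```

Each branch of the function maps directly to a test scenario: quiet hours, an active foreground app, and the daily cap, so the marketing rules and the test oracle stay in lockstep.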

Proximity Beacons

A number of the tracks either showcased or made reference to the use of beacon technology as a means of delivering up-to-date messages and special offers to customers based on their location within a store or theme park.

The use of the technology does, in my opinion, counteract its intrusive nature, as the consumer will be captive and in the right mindset to take on board the advertising messages. Some of the challenges outlined during the various talks centred upon the proximity limitations of the technology. In an expansive space such as a theme park, beacons are unlikely to interfere with each other’s transmission; however, how do companies ensure that in relatively small retail stores the use of beacons is appropriate and displays the right message at the right time?

It was this example that outlined to me one of the key challenges of testing beacons: how do you replicate a deployment on a large scale? If you set up a test lab with a selection of beacons, do you lose the actual proximity relationships of the live store environment? Is it sufficient to test using a small selection of beacons and conduct interruption testing scenarios?

This is a very real issue that companies need to consider when introducing new technologies to their digital marketing strategies.
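One way to explore the lab-scale question is to model beacon placements and flag the pairs whose coverage circles intersect; those are exactly the spots where interruption-style scenarios are needed. The coordinates (in metres) and transmission range below are invented for the sketch.

```python
import math
from itertools import combinations

def ranges_overlap(a, b, tx_range=5.0):
    """Two beacons may interfere if their coverage circles intersect."""
    (ax, ay), (bx, by) = a, b
    return math.hypot(ax - bx, ay - by) < 2 * tx_range

def interfering_pairs(beacons, tx_range=5.0):
    """Return index pairs of beacons whose coverage overlaps."""
    return [(i, j) for (i, a), (j, b) in combinations(enumerate(beacons), 2)
            if ranges_overlap(a, b, tx_range)]

# A toy floor plan: two beacons placed close together, one far away.
store_layout = [(0, 0), (4, 0), (30, 30)]
overlaps = interfering_pairs(store_layout)
```

This kind of model doesn’t replace testing in the live store, but it lets a small lab enumerate which beacon pairs need interruption scenarios before any hardware is installed.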

Wearable Technology

References to the Internet of Things and wearables brought some interesting viewpoints, but for me the best, and perhaps unsurprising, summary of this area came from the session on “Privacy and the adoption of wearable technology”. In this session, the key message was that most, if not all, policies around data security and protection apply to all devices. Securing the transport of a message from the back-end system through to the ‘thing’ must follow the same policy and legislation.

For me, the same can be said of the development of the ‘thing’ and also the testing of such devices. Validating the message transport and identifying weaknesses and vulnerabilities remains the challenge. Validating the display and user experience will require testing; developing omni-channel automation frameworks that maximise coverage whilst controlling the amount of maintenance will appeal to companies as the industry matures. This is certainly a key area of development that I am overseeing in the Sogeti Studio. In the coming months, we at Sogeti hope to be able to demonstrate these service innovations, to provide customers with an alternative approach to the current mode of operation.

The rise of Crowdsourcing – Testing device compatibility

Device fragmentation, specifically within the Android landscape, raises a number of challenges regarding the age-old question of “How much testing is enough?”
Speaking to a number of companies, including app development agencies, at the conference, the challenges were very similar: how do you make sure that the apps released are compatible with the devices in the market? Some answered with the use of emulators to provide breadth of coverage, complemented by a top-10 physical device list to provide depth. Others mentioned a reliance on crowdsourcing; booking slots to open up the scope of testing on real devices seems to be becoming a popular supplementary approach to release testing.

When we add operating system platforms and screen resolutions into the mix, there needs to be a more robust way of achieving the right level of quality. Tool vendors need to look at ways of replicating user interactions in a standard manner, to provide options in the marketplace.
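The emulator-plus-top-devices approach can be stated as a coverage problem. Below is a minimal “each-choice” reduction: every device, OS version and resolution appears at least once, without running the full cross-product. The device names and versions are examples only, not a recommended device list.

```python
from itertools import product

# Example dimensions of the compatibility matrix (illustrative values).
devices = ["Galaxy S5", "Nexus 5", "Moto G"]
os_versions = ["4.4", "5.0", "5.1"]
resolutions = ["720p", "1080p"]

def each_choice_cover(*dimensions):
    """Pick combinations so every value of every dimension appears at least once."""
    longest = max(len(d) for d in dimensions)
    # Cycle shorter dimensions so each of their values is reused as needed.
    return [tuple(dim[i % len(dim)] for dim in dimensions)
            for i in range(longest)]

matrix = each_choice_cover(devices, os_versions, resolutions)
full = list(product(devices, os_versions, resolutions))
```

Here the reduced matrix is 3 runs against 18 for the full cross-product; stronger criteria such as pairwise coverage sit between the two and are where crowdsourced real-device slots can add depth.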

All in all, the conference was very thought-provoking and has certainly provided a number of takeaways on how we at Sogeti can answer some of these challenges: extending the current offerings within the Sogeti Studio, developing models that improve testing coverage on devices by complementing emulation with physical device testing, and creating omni-channel automation frameworks that promote efficiency in the test cycles.


Posted in: API, Crowdsourcing, Developers, Innovation, Wearable technology      


The original principles behind the Agile Manifesto were created to enable, amongst other things, the “continuous delivery of valuable software”, with software projects undertaken by teams deployed in a single location. As businesses seek to reduce costs and increase the talent pool of skilled experts, distributed software engineering has become increasingly popular. Although the principles still apply when your project teams are located in different regional or global offices, achieving the same level of agility requires a more customised and focused agile approach. When your teams are in different time zones, in countries where the culture, attitudes and leadership style vary hugely, effective coordination and communication are no mean feat.

The benefits of achieving global agility, such as easier access to a wider range of technical resources and talent, increased scalability and higher quality, make it well worth the initial input into organisation and strategy, but there are several challenges that first need to be overcome.

Challenges of Distributed Agile Delivery

1. Visibility

It’s essential that all team members are aware of the status of every iteration; otherwise team commitments regarding the delivery of valuable software may end up with missing acceptance flows, leading to defects in the integrated system, impacting the quality of the product and delaying release and delivery. When your team is distributed, this requires a properly configured and customised tool with a simple workflow, allowing real-time status updates to be viewed by all and promoting collaboration. Ceremonies such as planning will require such an approach, to ensure stories are complete and can be fully tested.

2. Infrastructure

Perhaps your team has been using agile methods effectively for a few years, so the move to distributed agility feels like the next logical step, but what if some of the teams you are working with globally have never worked in an agile way before? If they are used to a waterfall method of working, the leap to distributed agility is going to be huge. Even if all teams involved have been working according to agile principles, it will be imperative to understand the frequency of releases to ensure feature delivery accounts for all dependencies. Providing a roadmap to illustrate this clearly and simply will raise awareness and understanding, and promote collaboration between teams and geographies.

3. Communication

Business culture varies from country to country and company to company, and not only is it difficult for teams who speak different first languages to communicate; the subtleties and nuances of words and context mean they can just as easily be divided by a common language! Teams may also use different words in their agile vocabulary. Large time zone gaps render face-to-face meetings nigh on impossible and can delay vital feedback, reporting and analysis, while culture clashes and different leadership styles can lead to a lack of understanding and trust, all of which can adversely affect quality and output.

4. Coordination

Distributed teams that are only separated by a few hours and overlap for a good proportion of the working day should be able to coordinate their work fairly effectively. However, if some teams only overlap by a couple of hours, then you’ll need to work out a collaboration schedule that maximises real-time communication and feedback cycles. This can lead to complex scheduling and to some teams being required to work outside of the normal working day.
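The scheduling problem can be made explicit by computing the shared real-time window from each team’s working hours expressed in UTC. The three example offices and their hours below are illustrative assumptions.

```python
def overlap_utc(teams):
    """teams: {name: (start_hour_utc, end_hour_utc)}.

    Return the (start, end) hours shared by every team, or None if there
    is no common window at all.
    """
    start = max(s for s, _ in teams.values())
    end = min(e for _, e in teams.values())
    return (start, end) if start < end else None

teams = {
    "London": (9, 17),     # 09:00-17:00 UTC
    "Bangalore": (3, 11),  # 09:00-17:00 IST is roughly 03:30-11:30 UTC, rounded
    "New York": (13, 21),  # 09:00-17:00 EDT
}
window = overlap_utc(teams)
```

With all three offices there is no common hour at all, which is precisely the case where rotating meeting times, out-of-hours working or asynchronous feedback cycles become necessary.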

5. Documentation

One of the benefits of agile is reducing time and cost by getting rid of extraneous documentation. When your teams are distributed far and wide, your documentation tends to increase, as does cost. The use of document repositories and version control is key, and illustrative documentation over the written word is also beneficial.

In order to achieve distributed agility, it’s essential to synchronise all of these elements and integrate your teams so they can work together towards a common vision with the same goals and objectives.

Here are 10 tips to achieving distributed agility:

1. Try agile locally first so it’s established in the culture, rather than going for global distribution with your first agile project. Create a hybrid, customised agile model that works for all teams when you come together.

2. Ensure all tools, particularly for build and test automation, are standardised and consistently configured and applied by everyone involved.

3. Enable knowledge sharing throughout distributed teams. Have regular knowledge sharing meetings and ensure that there is a secure back up plan in the event of redistribution, people leaving or people joining.

4. Create a simple but comprehensive email, video, phone and messaging communications strategy that links to a central record system so that information is neither missed out nor duplicated.

5. Create a Team of Teams with at least one representative from every geographically distributed sub-team, and get them to meet at the initial stage of the project, at the central project location, to work out the key strategies and processes we are discussing here.

6. Ensure quality with a clear definition of done. Enhance the planning process to ensure that all backlog items cover the required functional and non-functional elements. The team owns the quality of the software; by encouraging a global quality focus with collective ownership, the adoption objectives should be achieved.

7. Create a governance team with representatives from each distributed team to create a roadmap to success with appropriate goals. Make sure everyone knows the answer to the question “What does done look like?”

8. Protect your ROI and eliminate technical debt by creating architectural prototypes to highlight potential issues and identifying your architectural strategy at the earliest possible stage with room for flexibility in case of a sudden change in direction – this is after all an agile project!

9. Automation is key for distributed agile delivery as it speeds up processes such as testing so that one team is not left waiting for another to complete a task. It also standardises processes which has a positive impact on internal and external quality.

10. Collaboration in all things is the key to success: from using carefully configured tools that allow real-time interaction, to creating clear definitions of individual and team responsibilities, a common language, working practices and expectations, and tracking progress.

If you need assistance to successfully set up and manage distributed agile delivery, Sogeti offers a full range of flexible agile services, including adoption and roadmap workshops, agile development and testing, and agile training and consultancy – all aimed at enabling businesses to reap the benefits of agile within short time frames. Together with our Rightshore® optimised distributed delivery model and client experience, we can most definitely help you achieve success in your distributed agile projects.


Posted in: Agile, Collaboration, communication, Innovation, IT strategy, Opinion, Quality Assurance, Scrum, Software testing, test framework, Test Methodologies, Testing and innovation, Transformation, Transitioning      