SOGETI UK BLOG

I’m sure many of you will remember the days of 8-track cassette players, VHS recorders, black and white televisions, server rooms as big as mansions, laser discs as large as dinner plates, floppy discs, green screens, single-lens reflex (SLR) cameras and so on…

I remember growing up in 1960s Paddington. My dad trained as an electrical engineer and was employed by the London Underground as a senior engineer. He was one of the first, both amongst his friends and on our street, to purchase a black and white television and a radiogram, both manufactured by Rediffusion. I’m pretty sure that the generation after ours – and even some who grew up in the 60s/70s – do not know what a radiogram is, let alone what one looks like.

I remember the Rediffusion televisions clearly: brown, wooden boxes on four legs, several knobs and in-built extendable aerials. To get a fairly good picture and sound from the television, you had to strategically adjust the two-prong extendable aerial on top of the box. In the evenings and at weekends, our Paddington apartment was full of my parents’ friends and their children, who would come over to catch the latest news, gossip and TV shows. If we kids were lucky, we got to sneak in and watch a few episodes of Doctor Who, Lancelot Link the Secret Chimp, Batman and Robin, or even Top of the Pops!

Broadcast technology in those days was pretty rudimentary for both television and radio, and there were no handy remote controls! One had to physically walk up to the appliance to select the channel from one of three options (BBC1, BBC2 or ITV), tune the frequency, turn the volume up or down, and adjust the aerial. In the years that ensued, indoor and external aerials were manufactured – I could tell many a tale about this, but will reserve it for another blog.

Computers in those days were housed in huge cabinets, with servers as big as houses and banks of cooling devices. The storage capacities we have today – megabytes, gigabytes and terabytes – were pure science fiction. In fact, the first hard drive with a gigabyte of storage capacity was as large as a refrigerator, and that was as recent as the 80s!

Memory Lane – storage devices from the early computer era

This great website shows the history of computer data storage in pictures. It covers:

  1. Selectron Tube – Originally developed in 1946, the Selectron tube had a capacity of 256 to 4096 bits (32 to 512 bytes). The 4096-bit Selectron was 10 inches long and 3 inches wide. The memory storage device proved expensive and suffered from production problems, so it never became a success.
  2. Punch Cards – Early computers often used punch cards for input of both programs and data. Punch cards were in common use until the mid-1970s. It should be noted that the use of punch cards predates computers: they were used as early as 1725 in the textile industry (for controlling mechanised looms).
  3. Punched Tape – As with punch cards, punched tape was originally pioneered by the textile industry for use with mechanised looms. For computers, punched tape could be used for data input but also as a medium for data output. Each row on the tape represented one character.
  4. Magnetic Drum Memory – Invented way back in 1932 in Austria, it was widely used in the 1950s and 60s as the main working memory of computers. In the mid-1950s, magnetic drum memory had a capacity of around 10 kB.
  5. Hard Disk Drive – The first hard disk drive was the IBM Model 350 Disk File that came with the IBM 305 RAMAC computer in 1956. It had 50 24-inch discs with a total storage capacity of 5 million characters (just under 5 MB). The first hard drive with more than 1 GB of capacity was the IBM 3380 in 1980 (it could store 2.52 GB). It was the size of a refrigerator, weighed 550 pounds (250 kg), and its price at introduction ranged from $81,000 to $142,400.
  6. The Laser Disc – A precursor to the CD-ROM and other optical storage solutions, laser discs were 30 cm in diameter, were used mainly for movies, and had entirely analogue content.
  7. Floppy Disk
  8. Magnetic Tape

As the years passed and technology evolved, computers and gadgets got smaller and could handle larger amounts of data. Now there are storage devices the size of cigarette boxes that are capable of holding terabytes of data. Gone are the server rooms as big as castles; gone are the days of having to walk up to your radio or television to switch them on, change the channel, control the volume or tune the sound.

Our computers are now relatively small and capable of holding vast amounts of data; in fact you can now watch your favourite TV shows and movies on your mobile devices at very high resolution. Can you imagine carrying a 21” television along the high street to watch a program that you missed? Also gone are the days when individuals carried boom boxes or ghetto blasters on their shoulders to listen to their favourite songs.

Today, televisions come with ‘flat’ or ‘curved’ screens with Ultra High Definition, capable of displaying 4K resolution without pixelation. There is no vertical/horizontal hold to adjust, no analogue aerials to fiddle with to receive high-quality sound and pictures, and no mono sound. We have hundreds of channels to choose from, and a variety of providers to subscribe to, like Sky, Virgin and BT.

Credit cards mean we no longer have to carry hundreds of thousands of pounds in a suitcase, and there’s certainly no requirement to wait 4 days for the clearing office to honour your cheque when you can easily do an electronic transfer online via your bank to another account.

In this wonderful world of technology – whatever is next? What is going to be the next gadget or technology to blow our minds? VR, AR and the Internet of Things (IoT) are certainly leading the way at the moment, but with such fast-paced evolution, who knows what the world of technology will look like even 5 years from now!

To understand the current trends in the digital marketplace, why not read the Sogeti Studio OmniChannel Market Trends report – you can find it here.

AUTHOR: Rashid Ogunbambi

Posted in: Big data, Business Intelligence, Innovation, Internet of Things, Opinion, Technology Outlook      

 

What do you mean, more ‘technical’?

Traditionally, it seems, the IT industry has pigeon-holed testing as a non-technical activity, with testers praised for attributes such as curiosity, observation and communication skills, the ability to explain defects in simple language, social skills, and local domain knowledge.

Testing training such as ISTQB has always emphasised these attributes ahead of any kind of technical proficiency … but then the world went and changed.

So what changed?

The short answer is: Lots.

But let’s start with Agile. There is no requirement for independent testing – or really any specific testing roles – within Agile, which specifies multidisciplinary, multiskilled teams. Consequently, project staff such as developers, BAs and even project managers are now doing their own testing – non-professional testers without our tools and techniques, and lacking our inbuilt curiosity :).

Also with Agile, communication between teams is fundamental, so testers increasingly need technical/architectural/coding knowledge to join in and keep up with these conversations.

Wouldn’t it be great if traditional ‘requirements-based’ testers added value to themselves by also having technical skills?

  • Increased technicality
    Projects are becoming more “technical” every day, and with new technologies come new risks – it may be a revelation to some testers, but not all defects are to do with requirements. We must now target the defects in the stack rather than happen upon them while doing general functional testing – which cries out for testers with technical skills and knowledge on top of more traditional testing backgrounds.
  • Security
    Security testing is absolutely key to organisations now, and will only become more so. By its very nature, security testing tends to be very technical. As a tester, how can you test for security vulnerabilities with no knowledge of the underlying technologies in use? Or no idea what traffic types go via specific ports? Or without even an appreciation of the network architecture? Security testing also tends to use specialised tools (often very technical ones), and knowledge and experience of these is key, too. (A minimal example of this kind of technical curiosity follows this list.)
  • Mobile
    And then there is mobile. Multi-device, multi-platform, and an increasing need for cross-browser testing. Organisations are using mobile for business-critical applications, with new architectures, new tools and new ways of distributing code – all of which come with challenges. In this world, traditional testers – especially ones with poor technical skills – sitting and waiting for signed-off requirements before undertaking only functional tests look increasingly archaic.
  • Cloud
    Beyond testing business functionality, cloud testing often has a very technical aspect, including the ability to read and understand logs and an understanding of how to use stubs/drivers/mocks. Cloud technologies also tend to need load testing as well as fairly thorough security testing – organisations simply can’t afford to lose data – and with the emphasis on embedding software in the cloud and the use of virtualisation, this testing can be technical in nature. In this world, not appreciating the network architecture, or having no understanding of the technologies in use, just won’t wash.
  • Automation
    The iterative nature of modern software development, particularly Agile, means more regression tests tend to be needed more frequently, and that demands automation. To deliver this effectively, testers need experience and knowledge of automation tools, and often of scripting languages such as Java/VBA/Python.
    There is a plethora of automation tools (many open source) and multiple scripting languages, plus many automation techniques and frameworks – the days of automation meaning only HP QTP are over.
  • Data
    The world has gone data mad – businesses see massive potential in gathering and sifting our data, and how they collect and store information in data warehouses can be troublesome, requiring specialised testing. Testers are increasingly being asked to audit data trails, so they require technical knowledge and the use of new and innovative tools.
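
To make the security point above concrete, here is a minimal sketch of that kind of low-level curiosity – checking which TCP ports on a test server actually accept connections. The host name and port list are purely illustrative, and you should only probe systems you are authorised to test.

```python
# A minimal port-check sketch using only the Python standard library.
# "test.example.internal" and the port list are placeholders - substitute
# your own project's hosts and the ports the architecture diagram claims.
import socket

HOST = "test.example.internal"      # hypothetical test server
PORTS = [22, 80, 443, 1433, 8080]   # SSH, HTTP, HTTPS, SQL Server, alt-HTTP

for port in PORTS:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(2)                    # don't hang on filtered ports
        result = sock.connect_ex((HOST, port))  # 0 means the port accepted
        status = "open" if result == 0 else "closed/filtered"
        print(f"{HOST}:{port} is {status}")
```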

Why should I care?

Too many testers are still heard saying “I’m not a technical tester” – or “I have ISTQB Advanced Technical Test Analyst. I’m already a technical tester!” And too many Test Managers are becoming de-skilled – little more than mini project managers.

So why not embrace change and future proof yourself?

There is no compulsion to do any of this. If you are content testing requirements within a shrinking pond, or happy to face the fact that there may not be a role for you in the new Agile world, just carry on. But projects are looking for testers who can add value by having these technical skills and, from a purely selfish point of view, testers should remember that employers increasingly look for people whose programming and technical knowledge transfers well between domains!

Ok, I’m convinced, so how do I become a more technical tester?!

There isn’t an easy definition of technical testing – the first step is probably wanting to do it!

Not every project will support testing beyond the traditional requirements-based approach, and it is not unusual to find test environments locked down, draconian security policies preventing the loading of non-approved tools, a no-USB-stick policy, no admin rights for testers on local machines, and absolutely no running of scripts. These are project/domain decisions that must be respected. Increasingly, however, it isn’t like that: engagements are asking all teams for creative approaches to add value to a project – and if so, here is your chance.

Useful tips:

  • Spend time and effort understanding the architecture of a project.
  • Question the efficacy of just running scripts and queries that developers give you. Look to be truly independent and add value.
  • Question the efficacy of only using the test tools that the project supplies. Wherever possible, research and select your own test tools – why not use downtime to download and try out some new open source tools? Having a tool bag full of different tools that can be applied to different technologies is a positive thing. Also learn automation tools such as Selenium/JMeter and the scripting languages they use, as they are much in demand (see the sketch after these tips).
  • Befriend the technical people on a project; most architects and developers are only too pleased to explain the technical aspects of their work. Take copious notes, don’t be afraid to say you don’t understand, and keep a list of key words to go and look up in downtime – this is how we learn.
  • Ask about things like database schemas and traffic to and from certain servers. Are databases, say, replicated? What type of traffic goes out of certain ports? Be ever curious – have a thirst for knowledge beyond just the system requirements.
  • Take every opportunity to buddy with developers and existing technical testers, look at the tools and techniques they use, and do similar.
  • And finally, don’t stress about breaking stuff – you may have forgotten, but first and foremost you are a tester!
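
As a starting point for the tool suggestions above, here is a minimal Selenium sketch in Python. It assumes the selenium package and a Chrome driver are installed (pip install selenium), and it invents a page with a search box named “q” purely for illustration – substitute your own application’s details.

```python
# A minimal Selenium sketch: open a page, make a simple functional check,
# drive a (hypothetical) search box, and always release the browser.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys

driver = webdriver.Chrome()
try:
    driver.get("https://www.example.com/")
    assert "Example" in driver.title          # a simple functional check

    search_box = driver.find_element(By.NAME, "q")  # hypothetical element
    search_box.send_keys("regression pack" + Keys.RETURN)
    assert "results" in driver.page_source.lower()
finally:
    driver.quit()                             # always release the browser
```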

 

AUTHOR: Jason Mudge

Posted in: Automation Testing, Big data, Cloud, communication, Security, Technical Testing, Technology Outlook, Testing and innovation, Working in software testing      

 

Test automation is a promising way to increase the efficiency of repetitive testing tasks. However, it poses a few challenges in ensuring its effectiveness and expected benefits.

When thinking about test automation, the first thing to remember is that testing includes much more than just a set of repetitive tasks. Therefore, not everything can (or should) be automated. For example, key testing tasks such as the definition of test objectives or the design of a test plan are engineering tasks that constitute the base for testing, regardless of whether the tests are executed manually or automatically. On the other hand, not all test case components or test types need to be repeatable, while others (load tasks, regression tests, etc.) may really maximise the benefits of automation.

The decision on whether test cases are to be automated or not needs to be supported by an analysis of the expected Return on Investment (ROI). This is done by considering several aspects, such as the effort to create the automated tests, the execution time, the feedback provided and the maintenance effort implied by expected changes. In other words, we cannot limit the analysis to the idea that automation is a one-shot task, because the resulting test scripts need to be maintained.
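
As a back-of-the-envelope illustration of that ROI analysis, consider the sketch below – every figure in it is invented purely to show the shape of the calculation.

```python
# Back-of-the-envelope automation ROI, with invented illustrative numbers.
manual_cost_per_run = 4.0     # hours to execute the regression pack manually
runs_per_year = 26            # e.g. fortnightly regression cycles
build_cost = 60.0             # hours to script the pack initially
maintenance_per_run = 0.5     # hours to keep scripts in step with change

manual_total = manual_cost_per_run * runs_per_year
automated_total = build_cost + maintenance_per_run * runs_per_year

roi = (manual_total - automated_total) / automated_total
print(f"Manual: {manual_total}h, Automated: {automated_total}h, ROI: {roi:.0%}")
# Manual: 104.0h, Automated: 73.0h, ROI: 42%
```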

It is well known that in software engineering, the maintenance effort may be significantly greater than that required for the development of new functionality. We also know that maintainability relies on the ability to change or modify the implementation, and that this ability depends on the architecture, which organises the code and makes it more or less modular, changeable, understandable and robust. Exactly the same happens in automation: not every automation approach will lead to the same ROI. This is why we promote treating it as an automation project, where we need to align the objectives and the environment characteristics (frequency of software changes, data availability and integrity, etc.) with the definition of a suitable architecture, aimed at obtaining well-organised and structured test implementations that minimise the risks (maintenance effort, data variability, requirements changes).
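
One common way to achieve that kind of modular, maintainable structure is the Page Object pattern, sketched below. The page, locators and URL are illustrative rather than taken from any real project.

```python
# A minimal Page Object sketch: UI locators and interactions live in one
# class, so a UI change means one fix rather than edits to every script.
from selenium.webdriver.common.by import By

class LoginPage:
    URL = "https://app.example.com/login"   # hypothetical URL

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def log_in(self, username, password):
        self.driver.find_element(By.ID, "username").send_keys(username)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "submit").click()

# A test script now reads as intent, not as a list of locators:
#   LoginPage(driver).open().log_in("tester", "secret")
```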

This discussion poses a question: is test automation simply a ‘record and replay’ process, or a more complex engineering process? There exist tools that help us record scripts, which are able to interact with graphical user interfaces and reproduce the exact recorded interactions. However, what happens if something changes? Do we record all the scripts again, or do we modify the generated code, script by script, to adapt it to the changes? And if we modify the code (maintenance), would it not be worthwhile to have it implemented (with the assistance of recording to generate code chunks) on top of a modular, understandable and maintainable architecture? The report – The Forrester Wave: Modern Application Functional Test Automation Tools, Q2 2015 – reinforces this idea and states that “Fast feedback loops frequently break UI-only test suites”. Consequently, it suggests: “Ratchet down UI test automation in favor of API test automation. Modern applications require a layered and decoupled architecture approach.”



AUTHOR: Albert Tort
Albert Tort is a Software Control & Testing specialist in Sogeti Spain. Previously, he served as a professor and researcher at the Services and Information Systems Engineering Department of the Universitat Politècnica de Catalunya-Barcelona Tech. As a member of the Information Modeling and Processing (MPI) research group, he focused his research on conceptual modeling, software engineering methodologies, OMG standards, knowledge management, requirements engineering, service science, semantic web and software quality assurance.

Posted in: Automation Testing, Behaviour Driven Development, Budgets, Business Intelligence, communication, Digital strategy, test data management, Test Driven Development, Test environment      

 

From GoT to the Enterprise

The continual increase in unstructured Big Data from the Internet of Things, the changeable requirements for developing successful mobile apps and the trend for user-generated content are paving the way for NoSQL databases to prove their value. Relational databases will still be useful for managing more structured, uniform data sets, but they don’t possess the flexibility, agility, scalability and analytics capabilities of NoSQL database management systems. For example, multi-tenant applications such as popular games like Vainglory and Game of Thrones require developers to make frequent feature updates and specific changes to individual characters, creating interesting capabilities that keep users engaged. Similarly, as an increasing number of businesses undertake more Agile projects or undergo digital transformations, updates need to be made to enterprise apps much more quickly. The fixed schema in relational databases does not lend itself to this rapid process of continuous innovation.

 

 

Benefits of NoSQL


The main benefits that NoSQL databases bring to multi-tenant applications are:

  • Faster iterations permitted by dynamic schemas that accommodate real-time and other unstructured data by enabling rapid changes and constant updates, without any adverse effect on the entire database (see the sketch after this list)
  • Faster and partially automated application management and maintenance, as areas of the database can be altered in isolation without causing changes elsewhere
  • Horizontal elastic scalability for peak loads, as opposed to the necessity for vertical scalability in SQL databases
  • High availability, reliable performance and a better end-user experience, as NoSQL databases are built to serve predictable, low-latency requests
  • More cost-effective than SQL, as NoSQL requires only cheap commodity servers for effective operation
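
As a small illustration of the dynamic-schema point, the sketch below uses MongoDB via pymongo purely as a representative NoSQL store (assuming a local instance). Two documents with different shapes happily coexist in one collection – no ALTER TABLE, no migration of existing records.

```python
# Dynamic schema in action: two differently-shaped documents, one collection.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")  # assumed local instance
players = client["game"]["players"]                 # illustrative names

players.insert_one({"name": "Ayla", "level": 12})
# A later feature update simply starts writing richer documents:
players.insert_one({
    "name": "Brin",
    "level": 3,
    "loadout": {"weapon": "crystal bow", "buffs": ["haste"]},
})
```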

 

Health & Fitness

A good example of the benefits of a NoSQL database in action is Microsoft’s use of their own Azure DocumentDB to produce a scalable, schema-free and queryable distributed data store for MSN.com – their portal spanning 26 global markets with over half a billion monthly users. They initially rolled it out for the Health & Fitness section, which required scaling and authorisation capabilities to support 425 million unique users and 100 million direct authenticated users, creating a requirement for 20 TB of document storage. The write latency had to be under 15ms with 99% of read requests coming in at single digit latencies. The storage needed to be schema free and required rich query and transaction support with Hadoop based analytics, while the diverse set of vertical schemas required suitable data model extensions. MSN decided Azure DocumentDB was best placed to meet all their requirements and their applications were redirected and fully operational on the new database within just two months.

Several multi-tenant scaling solutions are available with Azure DocumentDB; and the type of project and desired outcomes will dictate which ones are right for you.

Sharding & Security

Here are some examples of the horizontal scaling or “sharding” and security capabilities:

  • Create multiple collections or reuse them for cost-effectiveness, and choose to partition by collection or by database
  • Store tenants within a single collection for a simpler, more cost-effective solution with transactional support across all data, and then dynamically add collections as the application grows
  • When you partition across multiple collections in this way, you can also ensure that larger tenants have access to more resources by allocating them their own collection
  • Ensure security at the application level by adding a tenant property inside the document for tenant identification, using filters to retrieve data belonging to a particular subset of tenants, and utilising authorisation keys to isolate tenant data and restrict access (see the sketch after this list)
  • Where you have a large set of users and permissions, you can place tenants across multiple databases to simplify the management process
  • Azure also allows you to partition tenants by database account, enabling isolation so that individuals can take responsibility for managing their own sets of collections and users
  • Various sharding options are available, including range, lookup and hash partitioning – for division by, for example, timestamp, by user/country, or by hash code for even distribution, respectively
  • Azure enables both read and write fan-out queries, whereby the app queries each of the partitions you have created in parallel and consolidates the results
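
To illustrate the tenant-property approach above, here is a sketch using the present-day azure-cosmos Python SDK (the successor to the original DocumentDB SDKs). The account, database, container and key are placeholders; the point is that every document carries a tenantId, and every query filters on it.

```python
# Application-level tenant isolation: filter every query on a tenantId
# property. Account URL, key and names below are illustrative placeholders.
from azure.cosmos import CosmosClient

client = CosmosClient("https://myaccount.documents.azure.com:443/",
                      credential="<account-key>")   # hypothetical account
container = (client.get_database_client("appdb")
                   .get_container_client("tenants"))

def documents_for_tenant(tenant_id):
    return container.query_items(
        query="SELECT * FROM c WHERE c.tenantId = @tid",
        parameters=[{"name": "@tid", "value": tenant_id}],
        enable_cross_partition_query=True,  # fan-out across partitions
    )

for doc in documents_for_tenant("tenant-42"):
    print(doc["id"])
```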

When considering which type of database to select, a great starting point is to look at your Customer Experience Management strategy and choose your solution based on the development speed, performance levels, security and scalability that best matches your customers’ needs and expectations.

To read the original post and add comments, please visit the SogetiLabs blog: Scalable Database Solutions for Multi Tenant Applications – The Rise of NoSQL


AUTHOR: Kevin Whitehorn
Kevin is Head of Delivery for all Infrastructural and Developmental engagements with Sogeti clients in the UK. The engagements he looks after range from Desktop Transformation, Hybrid Cloud implementations and Application Portfolio Refreshes, to the introduction of fully Managed Services.

Posted in: Application Lifecycle Management, Azure, Big data, Digital, Enterprise Architecture, Innovation, IT strategy, Microsoft, Security      

 

API Economy Boom

The API economy is proving to be big business with more than 10,000 APIs published in the last 18 months (according to the world’s largest API repository, ProgrammableWeb). If you’re not already leveraging APIs as a provider or a user, you’re missing out on the potential to improve your services, increase your customer base and customer engagement, raise brand awareness and raise your company’s overall valuation. Netflix is a fine example of a company that has benefited in all these ways and is now receiving an astonishing 5 billion requests to its public APIs every day.

APIs are also the cornerstone of the Internet of Things, set to be a $19 trillion business by 2020 according to Cisco. APIs are driving innovation from the inside out by supplying access to intellectual property, goods, services and new ways of working, in turn offering a competitive advantage to those businesses savvy enough to make them public. No longer merely a development technique, APIs have been elevated to an integral part of application infrastructure and an important business driver. It’s a complete reversal for business leaders who are used to closely guarding the secrets of their success, so it requires a change in mindset and strategy in order to get it right. An API-specific, manual and automated test strategy with effective protocols for handling glitches is necessary to ensure that both your internal business users and external API customers are getting the best possible user experience.

Putting APIs to the Test

When considering your API test strategy, you need to think about who your consumers are and what they will need from your API. That helps you plan your API quality needs and create the right test plans and environments. There are several types of API testing – once you’ve identified your consumers’ needs, you can decide how much of each type of testing you need to do. The kinds of API testing you should consider are:

Functional Testing

Functional testing can be in the form of unit testing, exploratory testing, automation or test cases. Your basic goal with this type of testing is to ensure that your API does what you say it will do and handles error conditions gracefully.
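
For illustration, a minimal functional-test sketch using pytest and the requests library follows. The endpoint and payloads are invented stand-ins for a real API’s contract.

```python
# Two functional checks: the happy path works, and errors fail gracefully.
import requests

BASE = "https://api.example.com/v1"   # hypothetical API

def test_get_known_resource_succeeds():
    resp = requests.get(f"{BASE}/orders/1001", timeout=5)
    assert resp.status_code == 200
    assert resp.json()["id"] == 1001  # does what the docs say it does

def test_missing_resource_fails_gracefully():
    resp = requests.get(f"{BASE}/orders/does-not-exist", timeout=5)
    assert resp.status_code == 404
    assert "error" in resp.json()     # a structured error, not a stack trace
```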

Load Testing

When you make your API available to partners or external developers, you significantly increase the load on your API. Any third party application using your API impacts the performance of your API… and if any of those applications goes viral… well, the load can be substantial. Your partners’ success can drive your failure! Load testing your API ensures that you can withstand the increased usage and helps you identify where and when to throttle the traffic.
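
A toy sketch of the idea follows – dedicated tools such as JMeter or Gatling are the right place for real load tests, but even a few lines of Python can reveal how latency behaves under concurrency. The URL and numbers are placeholders.

```python
# Fire concurrent requests at an endpoint and report p95 latency and errors.
import time
from concurrent.futures import ThreadPoolExecutor
import requests

URL = "https://api.example.com/v1/orders"   # hypothetical endpoint

def timed_call(_):
    start = time.perf_counter()
    resp = requests.get(URL, timeout=10)
    return resp.status_code, time.perf_counter() - start

with ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(timed_call, range(500)))

latencies = sorted(t for _, t in results)
errors = sum(1 for code, _ in results if code >= 500)
print(f"p95 latency: {latencies[int(len(latencies) * 0.95)]:.3f}s, "
      f"server errors: {errors}")
```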

Security Testing

APIs can increase your revenue stream, leverage your brand, and encourage partnerships. But they can also leave you vulnerable if you don’t take steps to protect yourself. Before embarking on a security strategy, analyse things like the attack surface your API exposes, the content and format of your payload, how secure your endpoints are, etc. Running a series of security scans against your API will help expose any vulnerabilities and give you a chance to correct them before anyone can compromise your data.
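
Two simple, scriptable smoke checks of the kind such a scan would include are sketched below: protected endpoints must reject anonymous callers, and responses should carry basic hardening headers. The endpoints are illustrative, and probes like these should only ever be run against systems you are authorised to test.

```python
# Security smoke checks: anonymous access rejected, hardening headers present.
import requests

BASE = "https://api.example.com/v1"   # hypothetical API

def test_protected_endpoint_requires_auth():
    resp = requests.get(f"{BASE}/admin/users", timeout=5)  # no credentials
    assert resp.status_code in (401, 403)

def test_basic_hardening_headers_present():
    resp = requests.get(f"{BASE}/status", timeout=5)
    assert resp.headers.get("Strict-Transport-Security")
    assert resp.headers.get("X-Content-Type-Options") == "nosniff"
```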

Virtualisation

Virtualisation isn’t really a type of testing but it is commonly used as a means to enable testing. Your partners need a way to exercise your APIs, load test their applications, run their own application security tests and create test automation frameworks. Isolating your production APIs from this activity will make them more stable and reliable for the applications using them. Creating mock services or virtual APIs for third parties to use as a sandbox will allow people to easily work with your APIs during the development phase.
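
As a sketch of what a virtual API can look like, here is a minimal sandbox double built with Flask. The route and canned payload are illustrative stand-ins for a real API’s contract; partners point their test configuration at this service instead of production.

```python
# A minimal virtual API: a stable, canned double of a production endpoint
# that partners can hammer freely without touching real data.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/v1/orders/<order_id>")
def get_order(order_id):
    # Canned response - stable, isolated from production systems.
    return jsonify({"id": order_id, "status": "DISPATCHED", "sandbox": True})

if __name__ == "__main__":
    app.run(port=8080)   # partners point their test config at this port
```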

Smarter than the Average Bear

API testing tools are still in their infancy but API specialist and Sogeti partner SmartBear offers some particularly good options, including functional testing, load testing, security testing and service virtualisation tools.

The API economy can fuel enormous growth and innovation, but with that innovation comes a responsibility for quality and security. Make sure that your APIs live up to your business goals by pushing them to their limits before pushing them to production.

AUTHOR: Lorinda Brandon
Lorinda Brandon is Director of API Partner Development at SmartBear Software, the leader in software quality tools for the connected world. She has worked in various management roles in the high tech industry including customer service, quality assurance and engineering. Lorinda has built and led numerous successful technical teams at various companies, including RR Donnelley, EMC, Kayak Software, Exit41 and Intuit, among others. She specializes in rejuvenating product management, quality assurance and engineering teams by re-organizing and expanding staff and refining processes used within organizations. Lorinda is a strong advocate for women in IT, promoting the importance of closing the gender gap for upcoming generations. Follow her on Twitter @lindybrandon.

Posted in: API, Business Intelligence, Innovation, Internet of Things, Publications, Virtualisation      

 

What’s Up Doc?

Digital globalisation, the explosion of mobile technologies, increasingly complex supply chains, the BYOD trend and the tendency for even business people to use consumer-calibre file sharing systems mean that the integrity and security of corporate documents have never been more at risk. In a recent study, leading analyst organisation Ovum discovered that the majority of organisations rely on inadequate firewalls in a vain attempt to protect their sensitive information throughout its life span. This makes a business vulnerable to regulatory non-compliance and data leakage, with sensitive documents falling into the hands of competitors and costly mistakes being made when multiple versions of the same document are floating around in cyberspace. These are serious issues that will affect the reputation of your business and challenge your clients’ loyalty to your brand.

This Message Will Self Destruct

Information Rights Management (IRM), the arsenal of special document protection techniques, goes hand in hand with authentication, authorisation, advanced multi-dimensional access control, auditing, encryption, electronic signatures, identity management and digital shredding. To meet the demands of new technologies, IRM has grown, and the new generation of systems gives you file-level control, auditing capabilities and a more flexible and effective document protection strategy. Now you can ensure that your intended privacy levels are integral to the document or file, that the rights stay with the original owner, and that your assets are properly secured within your corporate environment, in transit and beyond.

Today’s Secret Word Is…

Microsoft’s IRM solution is one of these super breeds and works alongside the easy-to-use interface of Office 365 Message Encryption for a comprehensive, simple-to-configure, user-friendly solution to all the potential challenges you face when your documents pass out of your control and into the hands of another team member or a partner. When you need an additional layer of protection, you can choose your own policy, for example by using the encryption feature and creating a named group of people who are the only ones with the power to make amendments. The IRM protection remains even when amendments are being made to the document in different locations. In addition, you can sync your protected documents to OneDrive for Business, and syncing libraries – which was awkward and clunky before – is now simple to do. For document protection in the cloud, Microsoft Azure Rights Management (RMS) protects all file types in any environment, shares files securely by email, allows monitoring and auditing, supports all commonly used devices (not just Windows computers) and supports B2B collaboration and on-premises services, in addition to providing excellent scalability and regulatory compliance.

Resistance is Futile

IRM is an important part of your Online Reputation Management Strategy (ORMS). When considering which provider to work with, ensure that the available features allow you to enforce your corporate policy, secure the safe passage of sensitive information, activate or end the life span of a document, restrict editing, viewing and forwarding, prevent screen printing and enable a full audit of your IRM. A sensible rule of thumb is to automate as much of the policy assignment as you can and enable users to add another layer of protection where necessary, because this eliminates room for human error and forgetfulness. When IRM is this simple to implement, it would be foolish not to make the most of this often-overlooked tool and reap the benefits of additional protection and enterprise-wide peace of mind that extends to your partners and clients.


It does not matter how big or small you are. Concepts of innovation are overwhelming your feeds on different social platforms. Over-hyped technology buzzwords and start-up (or whatever) approaches are knocking on our doors… but do they make sense?

An important thing to remember is to start with your company goals and apply technology to achieve those goals – not the other way around. A lot of companies are adopting different types of technologies but have no clue what they should do with them.

Let us start from the beginning. We are living in a Digital world. Technology is everywhere around us. With Digital going mainstream, we will talk about Digital Transformation.

In a nutshell:

“Digital Transformation is the collective noun for the movement which intertwines the physical and digital worlds to better determine client, customer and target audience needs, in order to deliver excellent products & services with an excellent digital experience through the use of new technology.”

If you ask me, Digital Transformation is the process that companies follow when applying new technology to reach company goals. It’s about having a strategic mindset of what you want to achieve and then applying the fitting technology to get a step (or a few) closer.

Digital Transformation is a new way of developing, working and delivering for you and your company, and it improves those activities in various respects. Think about the possibilities for things like visibility, efficiency, effectiveness, value creation, speed, scalability, customer experience, product development, innovation and so on.

An important thing to remember is: Step out of all the rumors and think about your vision. What do you want to achieve? Then apply the right technology, which supports that vision.

Another important thing to note is: Be where your target audience is. Let your company be inspired by the way your customers use Digital solutions and facilitate them in using it. Make their lives easier by adding value to the reason of your existence, your products and services.

Of course, Digital Transformation is going to change the way you develop, work and deliver. Of course, your business model will change. But it will change in a way that gets you closer to the people you serve. Is that a bad thing? No way. It has always been like this and always will be. The only thing that changes is the trend or technology we are talking about. In a few years, you will only remember the following:

“Digital Transformation: The paradigm shift towards business as usual”

To read the original post and add comments, please visit the SogetiLabs blog: Digital Transformation: Making sense of the mess we made with it



AUTHOR: Rick Bouter
Rick Bouter has been a Project Management Officer (PMO) since October 2013. Before this role, he completed his graduation project at the Sogeti trend lab VINT. For his final thesis, Rick wrote a report on the ‘Internet of Things’ trend and its impact on healthcare.

Posted in: Business Intelligence, Digital, Digital strategy, Innovation, Technology Outlook, Transformation      

 

Design in Large Enterprises

In a study of more than 400 large enterprises, conducted by Capgemini and MIT for the book Leading Digital, authors George Westerman (MIT), Didier Bonnet (Capgemini) and Andrew McAfee (MIT) determined that business model innovation and digital transformation begin with designing compelling customer experiences from the outside in. The same user-centric design principles that startups apply can be incorporated into innovation in large enterprises. But this approach must be envisioned together with the platform capabilities required to deliver the experiences.

The main difference between a startup’s approach to design and a large enterprise’s is that the latter can leverage existing customers, assets and channels in its business model for innovative design. The researchers termed the large enterprises that excelled at design ‘Digital Masters’.

The Digital Masters approach design and usability by honing the following:

  • Understand how to affect customer behaviour by first identifying what they do and why, how they do it and where
  • Envision innovative ways to enhance experiences across channels
  • Increase usability and accessibility across channels to enhance reach and engagement
  • Incorporate customer data and analytics at the heart of superior experiences
  • Provide seamless digital and physical context into compelling integrated experiences
  • Excel at refactoring core operations to develop the innovative platform capabilities that enable superior customer experiences

The study reveals that when combined with effective leadership that understands the importance of design, the business performance benefits of good design can be compelling. By any business performance metric, design matters: the Digital Masters far outperform their competitors, with 26 percent higher profitability, and they generate 9 percent more revenue than companies that do not have a design focus. For the large enterprises included in the research for Leading Digital – those with more than $500 million in revenue – this advantage can represent many millions of dollars on the bottom line.

Leading with Design and User Experience

Companies can no longer afford to ignore or gloss over the importance of design usability and compelling user experiences. They must recognise that value creation begins with design; a coat of spray paint at the end does not help. They must also realise that design is an investment that can be leveraged, because usage grows with an increasing user audience.

For a startup, a design-centric approach makes things happen faster. For a large enterprise, design is about how the company stays relevant to its customers. As additional modes of interactivity come to fruition via virtual reality, augmented reality, the Internet of Things, smart devices and innovative form factors such as wearable technology, design is the means to stay ahead of the curve in reaching new customers and unlocking value creation. With these new mediums for business model innovation, some design questions come to the forefront.

What does design look and feel like for virtual reality and augmented reality experiences? What does it mean to enhance a customer experience with IoT sensors and intelligent devices? How will the mobile application evolve in an increasingly connected world of intelligent platforms? How can natural language processing and artificial intelligence be incorporated into the design equation? How does platform engineering advance to support increased levels of scalability and distributed processing to support new modes of interactivity, richer contexts, and event-driven information delivery?

These are design challenges that present a significant opportunity for ventures in digital transformation and business model innovation to unlock market value. The answers to these questions may not necessarily come from large enterprises but from the lone designer making the next big thing in his/her garage. Design is a disruptor. A new era of design is about to begin. It is time to design for the future, and the future is NOW.

Read the first and the second articles of the series.

To read the original post and add comments, please visit the SogetiLabs blog: Designing for the Future … and the “Future” is NOW (Part 3)


 

AUTHOR: Sergio Compean
Sergio Compean comes to Sogeti USA with extensive technology consulting and leadership experience in the areas of distributed systems software engineering and enterprise solutions. Sergio has been successful in building culture of innovation and entrepreneurship to develop high performing teams that deliver significant value to clients across market segments and project portfolios. Sergio has a consistent track record for delivering high-touch client services with deep insights to realize positive outcomes from business strategy and technology vision. In addition to participating in go-to-market initiatives and influencing application platform strategy across R+D, sales and execution, Sergio has collaborated with executive management teams and strategic channel partners to achieve significant market development and increased revenue streams. His thought leadership work has been included in Gartner industry analyst presentations and organizational technology readiness initiatives. Sergio’s leading edge work has also been featured by the Microsoft Platform Architecture Group at Microsoft global conferences. He has served on industry standards organizations such as WS-I (now part of OASIS) developing services interoperability specifications. Sergio has produced highly-rated webinars, blog articles, client seminars, and publications covering advanced topics on emerging technologies like Windows Azure. Sergio was the founder of Connected Systems Group, part of the Haiti Rewired initiative, chartered with applying a systems thinking approach to delivering aid to Haiti after a devastating earthquake in 2010. He led the effort for developing ways to deploy mobile technology and cloud services to define a roadmap for economic recovery. Some of his publications related to this work include SWARM – Twitter Messaging Metadata Language for Disaster and Crisis Management, Empowering the New Haiti with Cloud Computing Technology, and Rewiring the Haiti Job Market with Mobile Crowdsourcing. Sergio is an Electrical and Computer Systems Engineering graduate from Rice University. He has also completed advanced courses on machine learning, artificial intelligence and HCI taught by industry thought leaders and professors at Stanford University. Sergio is a voracious reader of business strategy and design books, loves painting in acrylic, and enjoys riding his mountain bike on beautiful sunny days. He is the author of the whitepaper entitled Enterprise Architecture for Business Model Innovation in a Connected Economy.

Posted in: Business Intelligence, Developers, Digital, e-Commerce, Innovation, Internet of Things, IT strategy, Research, Technology Outlook, Usability Testing, User Experience, User Interface, Virtualisation, wearable technoloy      

 

UX+X=Business Model Innovation

As illustrated by Apple’s iTunes business model innovation – and now potentially by Apple Music – user experiences enabled by highly scalable platform services and easy-to-use mobile devices can unlock a significant amount of value. The mobile inflection point is well past, and entrepreneurial ventures are realising the importance of design in value delivery mechanisms. In her landmark Internet Trends Report in 2004, Mary Meeker, partner at leading venture capital firm Kleiner Perkins Caufield & Byers (KPCB), identified that usage growth was tied to the user experience. Mary focuses on investments in the firm’s digital practice and helps lead KPCB’s Digital Growth Funds, targeting high-growth Internet companies that have achieved rapid adoption and scale. In her 2014 Internet Trends Report, Mary included a slide entitled “R.I.P. Bad User Interfaces”, underscoring the importance of a quality user interface as the next key to growth.

To begin understanding the huge shift in value creation driven by user experiences, consider the following:

  • Uber, the world’s largest taxi company, owns no vehicles.
  • Facebook, the world’s most popular media owner, creates no content (at least not yet).
  • Alibaba, the most valuable retailer, has no inventory.
  • Airbnb, the world’s largest accommodation provider, owns no real estate.

All of the above enterprises focus on designing superior, highly-contextualized mobile user experiences to drive growth. The main assets of these enterprises are not inventories but platforms – platforms for business model innovation that deliver compelling user experiences.

Design in Startup Ventures

Startup ventures, founded or led by designers, are being acquired by big corporations at an increasing rate. Investors realise that these designers understand the usability challenges involved in delivering better user experiences given the constraints of their go-to-market sectors and device form factors. In 2006, Google acquired YouTube, co-founded by Chad Hurley, who studied design, for $1.65 billion. Since 2010, companies like Google, Adobe, LinkedIn, Dropbox, Yahoo and Facebook have acquired twenty-seven startups that were co-founded by designers.

In a study of the top 110 designers in the technology industry, 27% of those surveyed were working in early-stage startups where the capital raised was less than $10 million. These designers reported that the ratio of designers to engineers was somewhere between 1:4 and 1:5. This is a striking development, considering that the ratio used to be 1:15, or even 1:30, according to talent experts in the startup sector of the economy. Design is being baked into the startup DNA.

More of the business schools that feed talent into startups are also focusing on design as an imperative. Eight of the schools on the 2015 Financial Times Top Ten Global Business Schools list have student-led design clubs and/or design partnerships. These business schools include:

  • Stanford (d.school)
  • MIT (Sloan)
  • Harvard
  • Berkeley (Haas)
  • Chicago (Booth)
  • Pennsylvania (Wharton)
  • INSEAD
  • IESE Business School

Venture capital firms in Silicon Valley have been responding to the increasing importance of design in value creation at startup ventures. Realising the value that design-led startups have been creating, venture capital firms are making more funding available to these enterprises. In one study, five startups co-founded by designers were identified as having raised $2.5 billion. For the venture capital community, design is not about aesthetics, but about relevance and the evolution of business value in a world where innovation is accelerating and continually transforming.

To read the original post and add comments, please visit the SogetiLabs blog: Designing for the Future … and the “Future” is NOW (Part 2)


AUTHOR: Sergio Compean

Posted in: Behaviour Driven Development, Big data, Business Intelligence, communication, e-Commerce, Human Interaction Testing, Innovation, mobile applications, Open Innovation, Opinion, Testing and innovation, Usability Testing, User Experience      

 

The Fintech Forerunner

Fintech has officially exploded into the UK banking sector, with UK Trade & Investment estimating investment in UK fintech companies at more than £342 million in 2014. Our relatively straightforward regulatory system (just compare it to the state-to-state variations in the USA, for example) and the fact that the fintech companies in our capital are located close to the investment community are just two of the factors that have helped the UK become the forerunner of the fintech revolution.

However, fintech is a double-edged sword for traditional banks. On one hand, the digital disruption it brings has the potential to make them less relevant and less critical to modern banking. On the other hand, it could facilitate their ability to create more desirable, accessible and cost-effective B2B and B2C services, should they choose to leverage fintech innovations as a competitive differentiator. An additional factor that gives fintech startups an advantage over banks is that they only become subject to regulation once they have honed their business model and are ready to offer their services to potential customers, whereas the banks are already subject to the red tape.

FinTech Future

These facts highlight two major factors that are changing the face of testing in the banking sector. Firstly, the customer is now being offered an ever increasing choice of banking solutions from a huge range of providers.  Secondly, as explained in the UK Government Chief Scientific Adviser’s report: FinTech Futures – The UK as a World Leader in Financial Technologies, the glut of start-ups entering the banking sector increases the risk of questionable business practices, exploitation and fraud. For the customer, this creates a need to educate themselves about the fintech services on offer so they are better equipped to spot and avoid scams and select the best option. For banking organisations, it means that ensuring their applications are high quality, user friendly and provide a great customer experience is the only way to remain in the game.

The upshot of all this is that it becomes critically important to devise an end-to-end testing methodology incorporating the full range of testing techniques, including test coverage of business requirements and workflows, performance and functionality, data integrity and of course concurrency. It also becomes of paramount importance to test much earlier on in the development lifecycle and roll out upgrades and new releases more frequently, admittedly with less functionality per iteration, but crucially with fewer errors and glitches for a better customer experience.

According to a recent report, 53% of testing professionals in the USA, UK, Ireland and the Middle East cited the pressure of speedy upgrades and new releases as the primary motivator for improving testing capabilities. Historically, the banking sector has relied heavily on fixing defects after the event, during the production phase, so it’s heartening that 59% of respondents were keen to improve testing processes in their organisation.

Overcoming the Barriers

When it comes to perceived barriers to earlier, more comprehensive testing, we are up against the usual suspects and excuses that we see across the majority of business sectors:

  • Earlier testing is not a guarantee that the end result will be free from glitches
  • End-to-end testing increases time to market
  • Many banks and financial services organisations currently test in house and simply don’t have the resources
  • The value of testing is difficult to quantify, and the efficacy of the current test process can also be difficult to determine

Taking all of the above factors into account, particularly the heightened focus on the end user, one way forward for fintech testing is Acceptance Test Driven Development (ATDD), whereby the customer is brought in to collaborate on the testing process, so we can understand their expectations and behaviour before we even start to write the code. This process also brings business experts, programmers and testers together, which hugely improves the end results, rather than testing taking place in a virtual vacuum.
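
To make the ATDD idea concrete, here is a sketch of an acceptance test whose Given/When/Then wording would be agreed with the customer before any code is written. Teams often express these in Gherkin with tools like Cucumber or behave; plain Python with a toy in-memory bank stands in here, and all names are illustrative.

```python
# ATDD-style acceptance test: the Given/When/Then steps mirror the wording
# agreed with the customer. Bank is a toy stand-in for the system under test.
class Bank:
    """Toy in-memory stand-in for the real banking system."""
    def open_account(self, owner, balance_gbp):
        return {"owner": owner, "balance_gbp": balance_gbp}

    def instant_transfer(self, from_account, to_sort_code, amount_gbp):
        from_account["balance_gbp"] -= amount_gbp  # real system: an API call

def test_customer_sees_balance_after_instant_transfer():
    bank = Bank()
    # Given a customer with £500 in their current account
    account = bank.open_account(owner="Asha", balance_gbp=500)
    # When they make an instant transfer of £120
    bank.instant_transfer(account, to_sort_code="20-00-00", amount_gbp=120)
    # Then their available balance immediately shows £380
    assert account["balance_gbp"] == 380
```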

As banks generally have limited testing resources in house (52% have fewer than 5 test personnel: source), the need for a stronger testing strategy will also increase the importance of working with an outsourced testing partner. Although this new type of strategy goes against traditional testing in banking, making major changes is the only way for banks to compete – and in any case, the benefits of ATDD, such as higher quality applications, less downtime and satisfied customers, speak for themselves.

If you would like to find out how Sogeti can support your testing goals, please contact us at enquiries.uk@sogeti.com or call +44 020 7014 8900.

AUTHOR: Sogeti UK Marketing team

Posted in: Big data, Business Intelligence, Developers, Quality Assurance, Technology Outlook      