SOGETI UK BLOG

Tester – The Private Detective Of Software Development

In an earlier post, I talked about the characteristics a good tester should have, such as being analytical and curious, communicating well, and being able to put themselves in the users’ shoes. But lately I have started to think that the most important characteristic that defines a tester is being inquisitive: investigating and asking persistently until they get the information they need to test the application and to know whether it has the expected quality. Something like a cop or a private detective who searches for clues and interrogates witnesses to find out who committed the crime, how and why.

More than once we have to test applications without enough functional or technical documentation explaining how they should work. How can we say whether the application works as expected if we don’t know what to expect? How do we know the expected result of 1 + 1 if no one has told us it must be 2? That is the moment to turn on our detective skills and start interrogating the witnesses, that is, the business analyst, the functional analyst and, sometimes, even the developer. Ask, ask and ask again until you find the answer or pick up the culprit’s trail:

– What formula is applied to calculate the cost of the product?
– How must the application respond when a given sequence of steps is followed?
– If an error happens, how does the application respond?
– What fields are mandatory to create a new record?
– What filters have to be established to do a search in a certain screen?
– What security profiles must the user have to accomplish a certain action?

However, in most cases, this is not enough. As Grissom said in CSI: “People lie, but the evidence doesn’t.” That’s why we have to look for the fingerprints or DNA samples that identify the killer. And which clues can help us solve our crime? For example, we can search for old documents that reference the application under test, run queries against the database, or access applications that interact with the one we are testing, among other options.
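To make the database-fingerprinting idea concrete, here is a minimal sketch of such a cross-check in Python. It assumes a hypothetical SQLite database with a products table and a markup formula confirmed by the analyst; all names and values are invented for illustration.

  import sqlite3

  # Sketch: cross-check the cost the application displays against the
  # formula the business analyst confirmed. The database file, table
  # and column names below are hypothetical.
  EXPECTED_MARKUP = 1.20  # formula confirmed with the business analyst

  conn = sqlite3.connect("app_under_test.db")
  for product_id, base_cost, displayed_cost in conn.execute(
      "SELECT id, base_cost, displayed_cost FROM products"
  ):
      expected = round(base_cost * EXPECTED_MARKUP, 2)
      if expected != displayed_cost:
          print(f"Clue found: product {product_id} shows "
                f"{displayed_cost}, expected {expected}")
  conn.close()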

In short, to become better testers and to ensure the application under test has the highest possible quality, we have to grab our magnifying glass, start thinking as a detective would in the middle of a police investigation, and do everything we can to keep the crime from going unsolved.

AUTHOR:
Paloma Rodriguez has been a Test Engineer with Sogeti Group since 2011. In this role, she manages testing projects and participates in various publications and training activities.

Posted in: Behaviour Driven Development, Business Intelligence, Developers, Digital strategy, functional testing      

My neighbour Margit is a teacher in a local school. A couple of days ago, we met in the front yard and chatted about getting started after the summer break, and she told me that she was getting introduced to the concept of challenge based learning. Curious as I am, I had to search for that on the web afterwards to find out more.

As described on www.digitalpromise.org, Challenge Based Learning (CBL) provides an efficient and effective framework for learning while solving real-world challenges. The framework is collaborative and hands-on, asking all participants (students, teachers, families, and community members) to identify big ideas, ask good questions, identify and solve challenges, gain deep subject area knowledge, develop 21st century skills, and share their experience with the world.

The framework is divided into three stages:

  • Engage: Identify the big idea and start generating a large variety of essential questions that reflect the interests and needs of the community, eventually ending up with the definition of a challenge based on one of the essential questions.
  • Investigate: Based on the challenge, the learner conducts content- and context-based research to create a foundation for a solution. This begins with identifying a number of guiding questions related to the challenge – the things we need to learn in order to find the right solution. These questions are then used to identify the activities and resources necessary to answer them; this can call for research, simulations, games, questionnaires, etc. As part of the investigation, we also align questions and activities to any given standards – in school terms, this could be connected to subject areas like science, math, arts, etc. Eventually, when all questions have been answered and all activities have been conducted, we analyze the data and identify themes, developing a clear conclusion as a foundation for the solution.
  • Act: Here the evidence-based solution is developed and implemented, and the result of the implementation is evaluated. “Act” is split into two parts: solution development, where the foundation from the investigation phase is used to develop a solution in an iterative cycle of prototypes, experiments and tests; and implementation, where we put the solution into practice and evaluate the outcome – reflecting on what worked and what didn’t. When the implementation is complete, we can either iterate on the solution again or create a completion report sharing the work with the rest of the world.

Reading about the model I had two thoughts: “This sounds like a kind of iterative development model taken out of the IT context”, and “could we maybe get inspiration from this to plan and execute testing in an agile environment?”

If we dive into the latter thought: could we imagine a situation where the team uses the concept of challenge based learning to identify and plan testing, as well as to execute the exploratory charters and collect the knowledge gained, in order to communicate the quality of the system under test?

Let’s take a look at the steps described by the framework and try to put into words what they would mean in a testing context.

Engage:

Big ideas: When you start a project, you need to begin by defining the purpose of the system the team is going to build. What value should it give to the customer? This is crucial knowledge whether you are a tester or a test manager – you need to understand why we are building this system.

Essential questioning. For me, this could be two different activities on two different “levels”. It could be the equivalent of the product risk analysis – in whatever shape or form you choose to do it – brainstorming in the team to reach a common understanding. But it could also be used at epic or story level: brainstorm over questions for the epic to get a better understanding and clear acceptance criteria.

Challenges. This is where you define your “test project” – collecting the information from the PRA, the questions, etc. and defining the scope of the test, the test assignment. How you document it is up to your context, whether it is visualized in a drawing, written in a wiki or captured in an IEEE 829-formatted document.

Investigate:

When it comes to the investigate part of the model, I see it on two different levels – planning the test and preparing an exploratory test session. I will try to describe both in the following.

Guiding questions. From a planning perspective, the guiding questions could be used as a tool for the test manager: asking a number of questions to clarify the scope, risks and test approach for the project – all the questions needed to learn about the project and to test as efficiently as possible. From a tester’s perspective, on the other hand, I would use the guiding questions to identify everything I need to know and understand in order to test a feature in the best possible way.

Guiding activities and resources. Seen again from two perspectives, this activity would be used by the test manager and team to investigate options for the test strategy: automation framework, virtualization of the test environment, etc. I don’t necessarily mean documented in accordance with IEEE 829 or TMap, but a visualization of the strategy and plan for testing this system – creating a common understanding of the testing challenge at hand. From the tester’s perspective, this is where I would start identifying potential themes or areas of interest: where do we need to focus, and how should we address the testing of this feature?

Curriculum and standard alignment. Do we need to fulfil any standards? FDA regulations? Military standards? If so, we need to ensure that we check for compliance with those requirements. This is more checking than testing as such, but it is a necessary activity as well.

Analysis. Based on the activities and questions we can now specify the test strategy, and as mentioned before this can be done in whatever way fits your context: drawing, wiki, document, etc. The tester is now ready to identify the charters and sessions needed to fulfil the testing challenge – the strategy for the project. Based on the collected knowledge and challenges, the challenge can be broken down into separate sessions. For large systems there might be a need to do this together in the team, facilitated by the test manager.
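As a rough illustration of that breakdown, the testing challenge could be captured in a lightweight structure like the Python sketch below; the fields and example values are invented, not a prescribed format.

  from dataclasses import dataclass, field

  # Sketch: one way to represent charters and sessions derived from the
  # testing challenge. Field names are illustrative only.
  @dataclass
  class Charter:
      mission: str                  # what to explore and why
      guiding_questions: list[str]  # from the Investigate stage
      areas: list[str]              # themes / areas of interest

  @dataclass
  class Session:
      charter: Charter
      tester: str
      notes: list[str] = field(default_factory=list)
      bugs: list[str] = field(default_factory=list)

  checkout = Charter(
      mission="Explore discount handling in the checkout flow",
      guiding_questions=["Which discount rules can combine?",
                         "What happens on rule conflicts?"],
      areas=["pricing", "error handling"],
  )
  session = Session(charter=checkout, tester="Jane")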

Act:

When it comes to the acting part, I see a good mapping to session-based exploratory testing.

Solution development is when the testers are conducting the sessions; this is where prototypes are created, experiments are conducted and an iterative design cycle is executed. That cycle is the cycle of simultaneous learning, test design and test execution as defined by James Bach. This is where the knowledge we gained by questioning and challenging pays back – in the sense of finding bugs, learning about the system and questioning the solution.

From a test management perspective, this is where we follow up, mitigate risks and support the team. There might be activities to improve the environment, introduce new tools based on experimentation, etc.

Implementation and evaluation are the debriefing of the sessions. This is where we reflect on the session just completed, evaluate the outcome (the PROOF mnemonic: Past, Results, Obstacles, Outlook, Feelings – see satisfice.com for details) and evaluate any need to supplement with new charters, etc. This is also where the test manager prepares any reporting he needs to do for the project as a whole – so again, two different perspectives. When it comes to the reflective part, the test manager might revisit both sets of questions used earlier in the project, during engage and investigate, to see whether the answers were found and the test assignment met.

What Did I Learn?

What do I like about the above? Most of all the questions: using questioning deliberately and in a structured manner to learn and gain information as a foundation for test planning and for test specification and execution. I regularly see test managers sitting in an ivory tower when specifying their test strategy, involving others only on a very limited scale. And often we have to base our test strategy on a limited amount of documentation but lack a good approach to collecting the information needed – maybe this could be a way? With more and more projects being agile, maybe this challenge based learning concept could be used as a collaborative way to create a team-owned test strategy?

The challenge based learning method actually operates with a three-level question approach; this could add value for the test manager in planning and controlling the testing in the project, perhaps by using the levels like this:

  1. First level of questions to understand the assignment at hand (Engage – essential questioning)
  2. Second level of questions to understand what the approach needs to be in order to support the assignment (Investigate – guiding questions)
  3. Third level of questions to evaluate whether we did indeed fulfil the assignment (Act – evaluation)

Could there maybe even be groups of questions that we could reuse as a foundation for inspiration – at least for the first two levels of questions? One could see them as heuristics of test planning, or even as question patterns – q-patterns, as originally described by Vipul Kocher (e.g. at EuroSTAR 2008: http://www.slideshare.net/EuroSTARConference/vipul-kocher-software-testing-a-framework-based-approach):

A set of inter-related QUESTIONS grouped together to:
• Relate to some aspect of user or software requirements
• Provide various alternatives to arrive at a solution

There are questions that we repeat every time we start a new project; why not gather these and get a head start when engaging in a new project and investigating the challenge at hand? Of course, the context dictates a number of things that will differ from time to time, but other questions are always relevant.
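As a sketch of what such a reusable catalogue could look like, here is one possible shape in Python; the groups and questions below are examples only, not a standard or exhaustive set.

  # Sketch: a reusable q-pattern catalogue, grouped by the stage where
  # the questions are asked. Entries are illustrative.
  Q_PATTERNS = {
      "engage/assignment": [
          "What value should the system give to the customer?",
          "Which risks worry the stakeholders most?",
          "What is explicitly out of scope for testing?",
      ],
      "investigate/approach": [
          "Which standards or regulations apply (FDA, military, ...)?",
          "What can and should be automated?",
          "Which environments and test data do we need?",
      ],
  }

  def questions_for(stage: str) -> list[str]:
      """Return the reusable questions for a given stage as a checklist."""
      return Q_PATTERNS.get(stage, [])

  for q in questions_for("engage/assignment"):
      print("[ ]", q)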

Conclusion:

Well, this is all well and good, but what did it change that I tried this mapping between challenge-based learning and agile testing? I didn’t revolutionize the world of testing, that is true… but I think it is a small evolution towards a structured yet flexible and context-driven approach to test planning and execution. And just the process of stopping, looking at other models and trying to see how my world fits in is valuable in itself – continuous learning and improving my own world of testing.

(source www.digitalpromise.org)

AUTHOR:
Gitte Ottosen has 19 years of experience in test engineering and test management, is a certified ScrumMaster, TMap test engineer and CAT trainer, and holds an ISEB Practitioner Certificate in software testing.

Posted in: Behaviour Driven Development, Big data, Business Intelligence, Digital strategy, IT strategy, Managed Testing, Virtualisation      

Expedia reveals the holiday of the future:

  • Online technology driven by social media will be the driving force for inspiration when choosing a holiday
  • Travel providers will use virtual reality (VR) and augmented reality to captivate travellers before booking a holiday
  • Expedia’s Facebook Booking Bot heralds the advent of machine learning in the travel industry

Ever wondered what your future holiday could be like? The Holiday of the Future Report by Expedia.co.uk provides new insight into how technology will transform the experiences of future travellers.

The report reveals that travel companies are increasingly experimenting with 360-degree tours, VR technology and even Facebook Booking Bots to interact with customers, as Millennials rely more and more on technology. Social media will also be an integral part of the entire travel journey, from the consideration phase, all the way to returning and sharing the entire experience. Alex Platts, Commercial Director, Brand Expedia Northern Europe said: “Through Social Media, the generations of today feel much deeper connections with the wider world. Regardless of age, they have a strong desire to enrich their lives through travel and a genuine need of sharing every experience online in new ways.”

Expedia also predicts the booking process will be completely transformed, through immersive virtual and augmented reality technologies, which will allow travellers to have a deeper understanding of what a destination ‘feels like’ before making a booking. Rachael Power from Virtual Reality News, who contributed to the piece, explains how for the first time the holiday of the future will allow people to ‘try before they buy’. She said: “Virtual reality has the potential to radically change travel. People can ‘try before they buy’ by visiting locations in VR, from the UAE to the Irish Wild Atlantic Way. It also shakes things up for those who are immobile or on smaller budgets; the view from Machu Picchu is now simply the cost of a Google Cardboard headset away.”

Expedia envisions a future where content shared on social media will evolve from the standard photo into rich-media experiences. Virtual reality and 360° film technology will begin to reach critical mass once brands invest in content for this medium. Tourism Australia recently partnered with Expedia to launch an innovative brand campaign using 360° video called ‘How Far’. The series of films was created using the latest technology and highlights how travel can transform you and shape your view of the world. In addition to VR, the report also looks at developments in transportation, airport security and hospitality, all designed to deliver more personalized and seamless travelling experiences. This will happen sooner than expected, as self-driving transportation and smartphone based airport and hotel check-in technologies are already being tested.

Travel experts also anticipate the disappearance of manned check-in front desks and the emergence of holographic concierge features, digitised butlers or ‘smart mirrors’ designed to provide bespoke services according to each guest’s unique needs and wants.

It seems the future of travel isn’t that far away after all.

Posted in: Innovation, Social media, Social media analytics, Technology Outlook, User Experience, Virtualisation


We know that cloud computing is “the new normal” just like virtualization was in the past. And we also know that the adoption of cloud computing by your organization can come with a series of benefits including:

  1. Reduced IT costs: You can reduce both CAPEX and OPEX when moving to the cloud.
  2. Scalability: In this fast-changing world it is important to be able to scale your solutions up or down depending on the situation and your needs, without having to purchase or install hardware or upgrades yourself.
  3. Business continuity: When you store data in the cloud, you ensure it is backed up and protected, which in turn helps with your continuity plan, because in the event of a crisis you’ll be able to minimize downtime and loss of productivity.
  4. Collaboration: Cloud services allow you to share files and communicate with employees and third parties in this highly globalized world in a timely manner.
  5. Flexibility: Cloud computing allows employees to be more flexible in their work practices, because it’s simpler to access data from home or virtually any place with an internet connection.
  6. Automatic updates: When consuming SaaS you’ll always be using the latest version of the product, avoiding the pain and expense associated with software or hardware upgrades.

But once you ask yourself “what can possibly go wrong?”, you open your eyes to a “cloudy weather” in which you must plan, identify, analyze, manage and control the risks associated with moving your data and operations to the cloud.

To help you with the identification process, here is a list of risks that your organization can face once you start or continue the transition to the cloud:

  1. Privacy agreement and service level agreement: You must understand the responsibilities of your cloud provider, as well as your own obligations. In some situations, it is your obligation to configure the service correctly in order to get the best SLA possible.
  2. Regulatory compliance: Remember that although your data is residing on a provider’s cloud, you are still accountable to your customers for any security and integrity issues that may affect your data and therefore you must know the standards and procedures your provider has in place to help you mitigate your risk.
  3. Location of data: Know the location of your data and which privacy and security laws will apply to it, because it’s possible that your organization’s rights may get marginalized.
  4. Data privacy and security: Once you host confidential data in the cloud, you are transferring a considerable amount of your control over data security to the provider. Ask who has access to your sensitive data and what physical and logical controls the provider uses to protect your information.
  5. Data availability and business continuity: How is your organization and the provider prepared to deal with a possible loss of internet connectivity? Weigh your tolerance level for unavailability of your data and services against the uptime SLA.
  6. Data loss and recovery: In a disaster scenario, how is your provider going to recover your data and how long will it take? Be sure to know your cloud provider’s disaster recovery capabilities and if and how they have been tested.
  7. Record retention requirements: If your business is subject to record retention requirements, how well is the cloud provider prepared to suit your needs?
  8. Environmental security: Cloud computing data centers are environments with a huge concentration of computing power, data, and users, which in turn creates a greater attack surface for bots, malware, brute force attacks, etc. Ask: how well prepared is the provider to protect your assets through access controls, vulnerability assessment, and patch and configuration management controls?
  9. Provider lock-in: What is your exit strategy in case your provider can no longer meet your requirements? Can you move your data and operations to another provider’s cloud? Are there technical issues associated with such a change?
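One lightweight way to work with a risk list like the one above is a simple risk register that scores likelihood and impact, so the biggest exposures get attention first. The Python sketch below is illustrative only; the entries, scores and mitigations are examples, not recommendations.

  from dataclasses import dataclass

  @dataclass
  class CloudRisk:
      name: str
      likelihood: int   # 1 (rare) .. 5 (almost certain)
      impact: int       # 1 (minor) .. 5 (severe)
      mitigation: str

      @property
      def exposure(self) -> int:
          # Simple exposure score for prioritization.
          return self.likelihood * self.impact

  register = [
      CloudRisk("Location of data", 3, 4,
                "Contractually pin data residency; verify applicable law"),
      CloudRisk("Provider lock-in", 4, 3,
                "Define an exit strategy; prefer portable formats and APIs"),
      CloudRisk("Data loss and recovery", 2, 5,
                "Review and test the provider's DR capabilities"),
  ]

  for risk in sorted(register, key=lambda r: r.exposure, reverse=True):
      print(f"{risk.exposure:>2}  {risk.name}: {risk.mitigation}")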

Remember, we are talking about your data and your business here, and once you transition to the cloud you are still accountable and responsible for what happens to them. And yes, moving to the cloud comes with a series of benefits and rewards if the associated risks are identified and well managed.

References:
http://www.cio.com/article/2409109/cloud-computing/risk-management-in-cloud-computing.html
https://www.business.qld.gov.au/business/running/technology-for-business/cloud-computing-business/cloud-computing-risks

Posted in: Cloud, Data structure, Digital strategy, Innovation, privacy, Quality Assurance, Research, Security, Software Development, Technical Testing, Virtualisation


Big data / NoSQL Cassandra / SOLR / Natural Language Processing – Text Mining for Pre-screening of Cancer Clinical Trials.

Cancer clinical trials are research studies that test the pertinence of a new medical treatment on cancer patients. They are key drivers of medical progress, and their success depends essentially on the number of patients enrolled in them.

Pre-screening patients manually requires lengthy investigations and successive matching against patients’ records during a limited phase.

Add to this the large amount of money spent on this phase, and automating the eligibility pre-screening process turns out to be a promising and beneficial solution for cancer treatment.

In fact, automating this process is essentially an information retrieval task. Medical records, which originate mainly from the surgical pathology laboratory, constitute a rich source of unstructured data. They are written in natural/human language, which is complex and difficult for a machine to process. Dealing with this type of data requires a structuring phase that extracts useful information in order to provide the necessary knowledge to the machine – in other words, to translate the human language into a machine-recognizable one.

Text Mining and Natural Language Processing (NLP), combined, constitute a solid solution for representing the valuable information stored in medical records. Both deal with free text, and the main objective is to extract non-trivial knowledge from it. The field encompasses everything from information retrieval to terminology extraction, and from text classification to spelling correction and sentiment analysis. NLP methods rely heavily on probability theory, statistics and machine learning. They also deal with linguistic concepts, grammatical structure and the lexicon of words.

Recently, cancer research has been benefiting from advances in Text Mining and using its theory for clinical decisions. More precisely, automating the matching of patients to cancer clinical trials has been the subject of many studies and solutions dealing with information retrieval from medical records. In fact, working with cancer data means covering hundreds of cancer diseases with a very large lexicon. Many medical terminologies have been constructed to group medical concepts and thus provide a unified lexicon for the medical field. Those libraries, mainly UMLS, SNOMED and CIMO, are a major component of natural language systems designed for the medical field. They serve as a link between patient data and the Text Mining system, enriching clinical records and supplying synonyms for medical concepts.

OVERVIEW OF OUR ALGORITHM

To get a clear view of how Text Mining and NLP can help automate clinical trial matching, let’s dig into the methods most used for processing the natural language stored in clinical data. First, recall that the objective is to extract medical concepts and semantic types from both the clinical trial criteria datasets and the patient data. NLP provides a semantic representation of natural language sentences in order to map them to their original meaning. It uses either rule-based algorithms or the machine learning paradigm for more complex language processing.

Most automated patient pre-screening systems are rule-based: they are simple, fast and the preferred option to deploy. Such methods perform well on simple types of information, but for complex data, ML algorithms – although a black box for clinicians – are more robust and give good performance. Rule-based models are mainly used for medical text pre-processing: tokenization, sentence parsing, redundancy removal, etc. After pre-processing the free text, an assertion detection phase follows in order to detect negation. The NLP system also tries to detect medical terms using different medical terminologies. The other approach is to use machine learning models for the same purpose, through the analysis of a set of documents or individual sentences that have been hand-annotated with the correct values to be learned. The main ML algorithms used for NLP are Naïve Bayes, Support Vector Machines and Random Forests. They take as input a large set of features induced from patients’ records and try to learn rules from the annotated examples. The ML methodology can also be used to learn from previously selected patients’ data, by detecting the features that explain enrollment in previous clinical trials.
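As a minimal sketch of that supervised route, the snippet below trains a bag-of-words classifier on hand-annotated sentences with scikit-learn; the sentences and labels are invented and far too few for real use.

  from sklearn.feature_extraction.text import TfidfVectorizer
  from sklearn.naive_bayes import MultinomialNB
  from sklearn.pipeline import make_pipeline

  # Hand-annotated example sentences (invented; a real corpus would be
  # thousands of labelled pathology-report sentences).
  sentences = [
      "patient diagnosed with stage II breast carcinoma",
      "no evidence of metastatic disease",
      "history of prior chemotherapy treatment",
      "patient denies any smoking history",
  ]
  labels = ["eligible", "eligible", "ineligible", "eligible"]

  # Bag-of-words features (unigrams and bigrams) fed to a Naive Bayes
  # classifier, one of the ML algorithms mentioned above.
  model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), MultinomialNB())
  model.fit(sentences, labels)

  print(model.predict(["stage II carcinoma, no prior chemotherapy"]))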

After retrieving all useful information from the unstructured data and expanding it with all possible medical hyponyms from medical ontologies, the result serves as an information retrieval data source for matching patients against the inclusion and exclusion criteria of clinical trials. Given a cancer clinical trial and the patients’ encounter records, the Text Mining system supplies clinicians with a shortlist of eligible patients, providing a significant reduction in the time and effort of manual pre-screening.
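In simplified form, that final matching step can be seen as set operations over extracted concepts, as in the sketch below; the concept names stand in for real UMLS/SNOMED codes.

  # Concepts extracted from one patient's records, already expanded
  # with synonyms/hyponyms; names are illustrative stand-ins for codes.
  patient_concepts = {"breast_carcinoma", "stage_ii", "female"}

  trial = {
      "inclusion": {"breast_carcinoma", "stage_ii"},
      "exclusion": {"prior_chemotherapy", "metastasis"},
  }

  def is_candidate(concepts: set, trial: dict) -> bool:
      # Shortlist the patient if every inclusion concept is present
      # and no exclusion concept is.
      return (trial["inclusion"] <= concepts
              and not trial["exclusion"] & concepts)

  print(is_candidate(patient_concepts, trial))  # True -> show to clinicians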

BIG DATA ARCHITECTURE

Due to the large volume of data to be managed, we selected and designed a big data architecture based on Datastax. Why Datastax? Because it supports Hadoop, Spark, Cassandra and SOLR, ready to use out of the box. Deploying it through the MS Azure portal, it took around an hour to get several nodes up and running.

We imported all the data into a Cassandra database, SOLR indexed it, and we were quickly able to perform data exploration and search.
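A bare-bones sketch of that import-then-search flow might look like the following, assuming a running Datastax cluster with SOLR (DSE Search) enabled and using the cassandra-driver and pysolr client libraries; the node address, keyspace, table and field names are made up.

  from cassandra.cluster import Cluster
  import pysolr

  # Connect to a (hypothetical) Datastax node and keyspace.
  cluster = Cluster(["10.0.0.1"])
  session = cluster.connect("clinical")

  # Load a pathology report into Cassandra; DSE Search/SOLR indexes it.
  session.execute(
      "INSERT INTO pathology_reports (patient_id, report_text) "
      "VALUES (%s, %s)",
      ("P-0042", "Invasive ductal carcinoma, grade 2 ..."),
  )

  # Query the SOLR core (named keyspace.table in DSE Search) directly.
  solr = pysolr.Solr("http://10.0.0.1:8983/solr/clinical.pathology_reports")
  for doc in solr.search("report_text:carcinoma"):
      print(doc["patient_id"])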

We added synonyms coming from SNOMED and UMLS in order to use SOLR’s synonym search feature. Thanks to dedicated NLP developments in Python, we implemented natural language processing features (negation, semantic improvements, medical term identification, stemming, etc.) in order to improve the performance of the pre-screening process.
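To give a flavour of two of those features, here is a toy version of French stemming (via NLTK’s Snowball stemmer) and NegEx-style negation detection; the trigger list and window size are illustrative, and a production system would be far richer.

  from nltk.stem.snowball import SnowballStemmer

  stemmer = SnowballStemmer("french")
  # Tiny illustrative trigger list; real systems use curated lexicons.
  NEGATION_TRIGGERS = {"pas", "aucun", "aucune", "sans", "absence"}

  def annotate(sentence):
      # Return (stem, negated) pairs; a token counts as negated when a
      # trigger occurs within the three preceding tokens.
      tokens = sentence.lower().split()
      result = []
      for i, token in enumerate(tokens):
          window = tokens[max(0, i - 3):i]
          negated = any(t in NEGATION_TRIGGERS for t in window)
          result.append((stemmer.stem(token), negated))
      return result

  print(annotate("absence de métastases hépatiques"))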

TEST PHASE IN PROGRESS

By the end of 2016, we will complete the test phase and add improvements that take user feedback into account.

A new post will be published then, with the final conclusions and results.

CONCLUSION

Building on the available scientific literature, we were able to design a cancer clinical trial pre-screening solution in French. Several products exist in English, but no solution is available for France or other French-speaking countries.

The business benefit offered by our solution is already obvious: by suggesting a list of patients to the clinical trials team within minutes, instead of after several days of manual screening, the team can focus on confirming the proposed results rather than screening tons of patient records and data.

Contributor: Bilal AZENNOUD, Data Scientist, SOGETI France

Posted in: Automation Testing, Big data, Biology, Data structure, Innovation, Quality Assurance, Requirements, Research, Socio-technical systems, Software Development, Testing and innovation