SOGETI UK BLOG

For many years the generally accepted approach has been load testing, with the general intent of verifying that capacity planning is sufficient. When dynamic web content was largely generated server side, with JavaScript used primarily to show and hide content, this was enough. It gave us a general picture of performance, and life was good.

Most of the new web development I see uses some form of MV* JavaScript framework. This creates a whole new world for performance testing. We are no longer serving up dynamic content; we are serving up data, usually in JSON format, although the format really doesn’t matter. What matters is that much of the processing that used to take place on the server now happens in the browser. With the Single Page Application (SPA) pattern, the network traffic generally becomes chattier, yet very little data actually goes across the network.

Does the traditional performance test still meet our needs? The simple answer is “in part”. I say in part because we still need to ensure sufficient server capacity to handle the network traffic. Although we typically aren’t sending as much data across the network, we are using more connections. We used to pull down one dynamically generated page plus its associated static content (e.g. image files, JavaScript files, CSS), most of which the browser cached. Now we make several AJAX calls to REST services per “page”. These REST services are generally nothing more than a lightweight HTTP wrapper around a database call.
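As a sketch, such a service often amounts to little more than the following. Everything here is a hypothetical stand-in: the handler name, the SQL, and the stubbed `db` object are illustrative, not from any real service.

```javascript
// Hypothetical lightweight REST handler: a thin JSON wrapper around a
// database call. The `db` object below is a stub standing in for a real
// database driver; names and fields are purely illustrative.
function getOrders(db, customerId) {
  const rows = db.query(
    'SELECT id, total FROM orders WHERE customer_id = ?',
    [customerId]
  );
  return {
    status: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(rows),
  };
}

// Stubbed "database" returning canned rows.
const db = { query: (sql, params) => [{ id: 1, total: 19.99 }] };
const response = getOrders(db, 42);
console.log(response.body); // [{"id":1,"total":19.99}]
```

The server does almost no work here; the interesting processing happens after the browser receives that JSON.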

But there is one thing we are missing completely: we aren’t testing user-facing performance. When we push all that processing to the browser, that is where the user experiences performance. I’ve seen cases where the network transfer for all the data on an SPA view took less than 500 milliseconds, but the page took 16 seconds to render. Traditional testing ignores this unless someone notices it through anecdotal evidence, and we can’t rely on that happening; the typical response will be, “that isn’t a production environment”, harkening back to the days when nearly everything was server side.
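The gap becomes obvious if we split a Navigation Timing style measurement into network time and in-browser processing time. The timing object below is stubbed with numbers matching the scenario just described; in a real browser it would come from `window.performance.timing`:

```javascript
// Sketch: separating network transfer from in-browser processing using
// Navigation Timing style fields. The numbers are stubbed to mirror the
// anecdote above; in a browser the object would be
// window.performance.timing.
function renderBreakdown(t) {
  return {
    networkMs: t.responseEnd - t.requestStart,     // time on the wire
    processingMs: t.loadEventEnd - t.responseEnd,  // time in the browser
  };
}

const timing = { requestStart: 100, responseEnd: 600, loadEventEnd: 16600 };
const breakdown = renderBreakdown(timing);
console.log(breakdown); // { networkMs: 500, processingMs: 16000 }
```

A load test that only watches the wire sees the 500 milliseconds and declares victory; the 16 seconds the user actually waits never appears in its report.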

The state of web development as 2015 comes to a close should remind us that any paradigm shift must be viewed holistically. If development has undergone a paradigm shift, we must review whether our testing approach also requires one. Developer testing (e.g. TDD and ATDD) will obviously have to change if we are pushing much of the processing to the browser. Developers will have to write many more tests for front-end JavaScript code than they ever have before. Fortunately, most of these MV* frameworks have been written to embrace TDD and ATDD.

We must start looking at how long the browser takes to render the view. When we do this, we must also look at each of the common browsers we can expect to load that view. Each JavaScript engine executes at a different speed, so what works well in one browser may be incredibly slow in another. When we have a responsive design, performance can be even more problematic.

So what do we do? Most performance testing tools still focus on server response times. Some may have add-ons that test rendering time, but generally only in one browser. ATDD tools typically don’t measure performance, but they can be adapted to do so. But is this the best approach?
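Adapting an ATDD-style check to enforce a render budget can be as simple as asserting on a measured processing time. This is a sketch under stated assumptions: the budget value is arbitrary, and in practice `processingMs` would be read from the browser’s performance API through a driver such as Selenium, once per browser under test; here it is stubbed.

```javascript
// Sketch: a render-budget assertion an ATDD suite could run per browser.
// processingMs would normally come from a browser measurement; the
// values used below are stand-ins.
function assertRenderBudget(view, processingMs, budgetMs) {
  if (processingMs > budgetMs) {
    throw new Error(
      `${view} took ${processingMs} ms to render; budget is ${budgetMs} ms`
    );
  }
  return true;
}

console.log(assertRenderBudget('orders view', 450, 1000)); // true
```

Run against each target browser, a check like this turns the 16-second render from an anecdote into a failing build.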

When dealing with any paradigm shift, there are three key components that need to be addressed: people, process and tools. For this shift, the people need to recognize that server response times are only part of the equation. The process needs to acknowledge that a significant portion of the processing is moving to the browser and that this processing must be performance tested. Finally, the tools need to be improved or adapted to address this need.

The point is, performance testing just got a lot harder!


 

 

AUTHOR: Matthew Elmore
Matthew Elmore, with 20+ years of experience in the development of distributed systems, joined Sogeti Des Moines in June 2013 as a Manager Consultant specializing in Java technologies with a focus on Web Service Architecture and Design. A longtime practitioner of TDD, BDD and Agile methodologies, his practice of software craftsmanship was quickly recognized, which led to him taking leadership of the Des Moines SD & I practice later that summer. While maintaining a focus on Java technologies, Matt continues to pick up 1-2 new languages and frameworks per year, as he has for the last 15 or so years. This has allowed him to identify many solutions that don’t necessarily fit into the classic Java or .Net mold. Matt has developed training in TDD, BDD, paired programming and source control, and continues to develop new training courses and materials. He led the SD & I Enterprise Architecture special interest group for 2014 and led the effort in the Des Moines unit to implement a more standardized technical interview process. Matt stepped away from the practice manager role for 2015 in order to focus on providing solution architecture services and to develop training to strengthen Sogeti’s technical image and delivery capabilities. Matt is active in his community through speaking, volunteering, and user group participation. His speaking engagements include:

  • I 380 Java Users Group
  • Central Iowa Java Users Group
  • Iowa Ruby Brigade
  • PyOwa Python Users Group
  • Agile Iowa
  • Iowa .Net Users Group
  • Iowa State University Computer Science and Engineering Club

The topics he can be heard speaking on include:

  • Agile Practices
  • Software Craftsmanship
  • Application Architecture
  • Continuous Integration and Delivery

Matt is a leader and technical resource for the Des Moines Charity Hackathon.


 

As more IT organisations transform to an Agile workflow, architecture must transform its workflow as well. As architects, we must do a better job of recognising trends and proactively researching technologies for future adoption. While looking toward the future, we must simultaneously catalog, approve and govern the enterprise technology stack.

Certain elements of the Agile Manifesto are sometimes interpreted to contradict even the leanest and most agile architectural governance.

For instance:

  • Individuals and interactions over processes and tools
    • I’ve seen this used as an argument against any process for performing due diligence prior to adopting or updating technologies.
  • Working software over comprehensive documentation
    • I’ve observed this used as an argument against cataloging the technical stack, which facilitates the EA process of monitoring and governing when a technology must be upgraded or replaced altogether.

The Agile Manifesto provides excellent guidance for the efficient development of quality software. But the minute your organisation grows to more than one team, some level of structure and process becomes necessary to manage the scaling of Agile. This process helps to avoid the “my favorite tool is better than your favorite tool” mentality. A compromise can sometimes prevent the introduction of ‘n’ tools that provide the same capability; at other times, mediation and governance are required.
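As an illustration, a lean technology catalog can be as simple as a shared record per tool with its governance status. Every field and value in this entry is a hypothetical example, not a prescription:

```json
{
  "technology": "AngularJS",
  "category": "front-end framework",
  "status": "adopt",
  "approvedVersions": ["1.4.x"],
  "reviewDate": "2016-06-01",
  "owner": "EA guild"
}
```

A record this small is enough to answer the two questions governance actually needs: is this technology approved, and when do we look at it again?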

A leaner approach to architecture doesn’t have to preclude EA governance. On the contrary, the two can complement one another nicely. Governance lets us monitor our existing technical stack and update or upgrade as required. It also lets us look to the future and decide where we should be going with our design choices.

To read the original post and add comments, please visit the SogetiLabs blog: Architecture in an Agile Age


