Performance has become critical to the success of websites, and of e-commerce sites in particular. Customers expect web pages to load ever faster, and they quickly lose patience, especially during a purchase process, if they have to wait too long. The key question is how long each of the various pages can be allowed to take, at most, to load and present relevant content in the browser.
This leads to the next question: what is the relevant page content? What are the must-have elements, and which functions have to be accessible in order for the page to catch and retain the user’s attention? At OTTO we have been engaging with these questions for a number of years, and have made some interesting discoveries during this time.
In this process, iteratec has been assisting us in designing the concept and putting the system into practice. Our collaboration over the past two years has produced a tool for measuring the performance of websites: the OpenSpeedMonitor, which we would like to introduce here.
Uwe Beßle and Nils Kuhn from iteratec will be presenting the tool on November 18 at the Velocity Europe conference. In this first part of our interview, they give us an overview of the OpenSpeedMonitor and a taste of their Lightning Talk in Barcelona.
Oliver: When and how was the idea born to develop a tool for measuring the performance of websites? After all, there are a range of tools on the market already, both commercial and open source.
Nils and Uwe: The idea originated from working on a regular basis with the solutions available on the market. As part of a project for continuous performance optimisation of www.otto.de, we and other stakeholders selected the tool WebPagetest for measuring and analysing performance. WebPagetest is a terrific open source tool that captures highly detailed, true-to-life data in a real browser. Unfortunately, it has no built-in functionality for running measurements on a regular schedule or for analysing trends. These are features we urgently needed for continuous monitoring of the performance of otto.de.
There is in fact a WebPagetest add-on that would provide the regular monitoring features we sorely missed. But this software, the open source WPT Monitor, unfortunately never matched the quality and success of WebPagetest. The most recent release, version 0.3.0, dates back to 2010. A number of weak points were evident right from the first rollout: very restricted user and rights management, no way to precisely control the execution time of measurements, and a degree of instability in operation.
Over time, the list of missing features, and of problems owing to the lack of further development, grew longer and longer. To give just one example, many of the options available for individual measurements in WebPagetest are missing in WPT Monitor, so they cannot be configured into the jobs for regular measurements.
OTTO then approached us with the request to provide a solution for automated calculation of the Customer Satisfaction Index (CSI), an indicator of customer satisfaction based on measured load times. We decided not to integrate this functionality into the code of the WPT Monitor, which was not undergoing any further development. Instead, we began developing a new solution from scratch, with the objective of ultimately delivering a better, actively maintained alternative to the WPT Monitor. This was the starting point for the development of the OpenSpeedMonitor. By the way, it was clear from the outset that iteratec would not be entering the product business, but that we would ultimately make the solution available as open source software. Over the time since then, we’ve integrated a lot of the special requirements from OTTO, such as the CSI calculation, into the OpenSpeedMonitor.
Oliver: What functions are part of the OpenSpeedMonitor, and what differentiates this software from other tools on the market?
Nils and Uwe: There are a number of features in the OpenSpeedMonitor we are proud of, and which enable us to leverage the established WebPagetest infrastructure to the maximum. It begins with syntax highlighting and code completion in the editor for measurement scripts.
Figure 1: Script editor with syntax highlighting and code completion
It continues with the ability to parameterise scripts within each job and to set the execution time of measurement jobs flexibly using crontab expressions.
Figure 2: Flexible job scheduling with cron strings
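Crontab expressions encode a schedule in five fields: minute, hour, day of month, month, and day of week. The schedules below are invented examples of the kind of strings that could be attached to a measurement job (the trailing text is annotation, not crontab syntax; some schedulers, such as Quartz in the Grails world, additionally prepend a seconds field):

```
*/15 * * * *    every 15 minutes
0 6 * * 1-5     at 06:00 on weekdays (Monday to Friday)
30 2 * * 0      at 02:30 every Sunday
```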
Last but certainly not least, the OpenSpeedMonitor provides the ability to configure the network connection profiles required for a measurement.
A key part of the software, and an area where we have actually enriched the functionality of the WebPagetest tool itself, is support for multi-step measurements that follow a complete customer journey, measuring and logging each individual page view separately. That might be an ordinary feature for some commercially available tools such as Keynote or Gomez. But for the vast number of people who use WebPagetest to measure site performance, this is a breakthrough improvement over the status quo.
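A multi-step script for such a customer journey might look roughly like this, in WebPagetest's script syntax (the shop URLs and event names below are invented for illustration):

```
setEventName    homepage
navigate        https://www.example-shop.com/

setEventName    productDetail
navigate        https://www.example-shop.com/product/12345

setEventName    checkout
navigate        https://www.example-shop.com/checkout
```

With multi-step support, each navigate step appears as a separately measured and logged page view, with the browser cache carried over from the previous steps.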
For one thing, the agent infrastructure can be used far more efficiently if an entire customer journey is measured in each run. More important in practice, though, is that WebPagetest's First and Repeated View measurements do not reflect the customer's reality, because they cannot properly depict the real caching effects in the browser. The First View gives an excessively negative picture of load times, since the measurement is taken with an empty browser cache. In reality, this is a rare situation: starting with the second page view in a customer journey, the circumstances are very different, because the browser cache is no longer empty and the commonly used assets can be loaded from it.
In contrast, the Repeated View presents an overly optimistic picture, because an identical page is viewed a second time unchanged, something web users rarely do. Analysis of http://www.otto.de access statistics shows that these two cases account for little more than 5% of the page views of OTTO customers. In the remaining roughly 95% of page views, the real performance lies somewhere in between. By modelling a typical customer journey in our measurements, we can depict reality far better.
Figure 3: Comparing the performance of the usual First View (dark blue) and Repeated View (green) with a standard view within a customer journey (lighter blue)
One outstanding feature of the OpenSpeedMonitor is that it calculates business metrics such as the Customer Satisfaction Index (CSI) on the basis of the measured page load times, essentially converting each page load time into a percentage expression of customer satisfaction. In other words, we measure the degree, as a percentage, to which OTTO customers would have been satisfied with the page load time.
Oliver: Can you give us some details on the customer satisfaction index?
Nils and Uwe: Everyone knows that page performance has an impact on customer satisfaction. However, using page load times to draw reliable conclusions on customer satisfaction is anything but straightforward. The few simple rules of thumb on the market, such as "a one-second deterioration in load time causes a 7% loss of sales revenue", might be readily understandable, but fall way short of reflecting reality. The correlation is non-linear, and the performance expectations of customers depend on the pages themselves. Expectations increase the more customers use http://www.otto.de. And the situation keeps evolving: today, OTTO customers are generally more impatient than they were two years ago.
OTTO has carried out extensive studies on this question and collected substantial data on the correlation between page load times and customer satisfaction. This investigation has recently been repeated in order to obtain fresher data on customer expectations. As a result, we now have conversion tables that can tell us the customer satisfaction percentage for any given page load time.
Figure 4: Non-linear dependency between load time and customer satisfaction
Using such conversion tables, the OpenSpeedMonitor can convert the measured load times into the CSI of particular pages and combine these figures, based on further weighting factors, into the CSI of an entire web application. This feature is unique to the tool: no other performance tool on the market can aggregate performance metrics at this level.
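The conversion and aggregation can be sketched roughly as follows. This is a minimal Python illustration of the idea, not OpenSpeedMonitor's actual implementation; the table values and page weights are invented, since the real OTTO conversion tables are proprietary:

```python
# Hypothetical mapping: load time (ms) -> customer satisfaction (%).
# The real conversion tables are the property of OTTO; these numbers are invented.
CONVERSION_TABLE = [(1000, 99.0), (2000, 90.0), (4000, 60.0), (8000, 20.0)]

def satisfaction(load_time_ms, table=CONVERSION_TABLE):
    """Interpolate the customer satisfaction percentage for a measured load time."""
    if load_time_ms <= table[0][0]:
        return table[0][1]
    if load_time_ms >= table[-1][0]:
        return table[-1][1]
    for (t0, s0), (t1, s1) in zip(table, table[1:]):
        if t0 <= load_time_ms <= t1:
            frac = (load_time_ms - t0) / (t1 - t0)
            return s0 + frac * (s1 - s0)

def application_csi(measurements, weights):
    """Combine per-page CSI values into an application-wide CSI
    as a weighted average over the measured pages."""
    total_weight = sum(weights[page] for page in measurements)
    return sum(satisfaction(ms) * weights[page]
               for page, ms in measurements.items()) / total_weight
```

For example, with a homepage loading in 2 s (weight 2) and a product page in 4 s (weight 1), the application CSI would be the weighted average of the two per-page satisfaction values.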
The data for converting load times into customer satisfaction percentages, and for weighting the various pages within the overall index, is specific to each web application; in this particular form, with these particular calculations, it is of course the property of OTTO.
Oliver: How do you organize the development of the software? Can I download the software via GitHub?
Nils and Uwe: Yes, in this respect OpenSpeedMonitor is like any other tool. A small team at iteratec takes care of development, based on the Grails framework. Nils is the lead developer, and also ensures that contributions from the various stakeholders are integrated smoothly. We frequently get young members of the iteratec team involved in development, and also interns and trainees.
The sources are all managed in a Git repository. As the OpenSpeedMonitor is open source software, we will make the source code available to the public via GitHub.
A Jenkins server is used as the basis for the automated build processes. For automated testing, we use the extensive testing support of Grails, which in turn is based on JUnit and Spock.
As usual at iteratec, we use an agile development process. Because we have a distributed team, we use an electronic task board: pending tasks are all managed with JIRA and its GreenHopper extension for agile projects. Here we can draw on the vast amount of experience iteratec has acquired in other projects.
Oliver: What message do you want to give to the audience during your talk at the Velocity Conference?
Nils and Uwe: Our talk at Velocity will be a lightning demo, so our focus will be on demonstrating the tool and how it can be used.
The main message is that WebPagetest is still a great tool for measuring website performance, and that with the OpenSpeedMonitor people have a tool available right now to automate WebPagetest measurements in a practicable way and make greater sense of the metrics.
This interview will be continued:
- Part 2: How the OpenSpeedMonitor is used in practice at OTTO
- Part 3: What's next? – OpenSpeedMonitor Roadmap