Non-linear viewing, such as catch-up or on-demand, is becoming increasingly significant in the broadcast world, so the company’s channel4.com website is now critical to the channel’s success. With over one million on-demand views a day and 10 million registered users, it’s essential that the site runs efficiently. This means stringent testing before services go live.
Performance testing, and in particular load testing, is a crucial element of the development process, as Quality Assurance (QA) Manager Mark Smith explains: “We have a lot of very small releases and we have performance testing embedded in the development team. We work in agile development teams; our testing is specifically focused and we have two performance testers.”
As well as longer overnight tests and functional testing, the team runs short 10-to-15-minute load tests in its Bamboo continuous deployment pipeline to reveal any problems with response times or transactions per second before new services go into production.
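A CI gate of this kind can be sketched as follows. This is a minimal illustration only: the simulated request, thread count, and the 50 ms / 100 TPS thresholds are assumptions for the example, not Channel 4’s actual configuration or a LoadRunner API.

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def send_request():
    """Hypothetical stand-in for one request to the service under test.
    A real test would issue an HTTP call; here we simulate latency."""
    start = time.perf_counter()
    time.sleep(0.001)  # simulated service response time
    return time.perf_counter() - start

def run_short_load_test(duration_s=2, threads=8):
    """Fire requests for duration_s seconds and collect latencies."""
    latencies = []
    deadline = time.time() + duration_s

    def worker():
        while time.time() < deadline:
            latencies.append(send_request())

    with ThreadPoolExecutor(max_workers=threads) as pool:
        for _ in range(threads):
            pool.submit(worker)
    return latencies

def gate(latencies, duration_s, max_p95_s=0.05, min_tps=100):
    """Fail the build if response times or throughput miss their targets."""
    p95 = statistics.quantiles(latencies, n=20)[-1]  # 95th percentile
    tps = len(latencies) / duration_s
    return p95 <= max_p95_s and tps >= min_tps

latencies = run_short_load_test()
print("requests:", len(latencies), "gate passed:", gate(latencies, 2))
```

A pipeline step that calls a gate like this can fail fast on a latency or throughput regression before a release reaches production.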
However, testing was being hampered by the tools. Some could not provide the necessary reporting sophistication while others were difficult to script or had expensive pricing models based on consumption. The team works on continuous integration and continuous deployment models and needed a system that would provide more detailed statistics and analysis.
Channel 4’s senior performance analyst, Nicholas Godfrey, had previous experience with LoadRunner Professional. This industry-standard performance engineering software generates real-life loads, identifies and diagnoses problems, and gives developers confidence that the services they deploy will work efficiently from day one.
“I’m a LoadRunner Professional man. I’ve used it for more than 10 years and it’s excellent. A first-class product; an analysis tool to die for, so it was top of my list,” says Godfrey. Because of this and the attractive LoadRunner Professional pricing model, the company contacted Micro Focus partner and test specialist Infuse Consulting. Infuse and LoadRunner Professional developers then worked with Channel 4 on a proof of concept to get the Continuous Integration plug-in for its Jenkins development automation solution up and running.
Rather than throwing millions of simulated users at the cached front-end, Channel 4 testers now isolate applications and hit the back-end servers with individual tests. Volumes are set according to the traffic logs for individual services, which means typical tests are limited to 1,000 to 1,500 threads.
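Sizing a test from observed traffic rather than an arbitrary user count can be sketched like this. The service names, peak figures, and 1.5× headroom factor are illustrative assumptions; only the 1,500-thread cap comes from the text above.

```python
# Hypothetical peak requests/second per back-end service, as might be
# derived from access logs. These figures are invented for illustration.
observed_peak_rps = {
    "catalogue-api": 420,
    "playback-api": 960,
    "user-profile-api": 150,
}

def threads_for(service, headroom=1.5, floor=100, cap=1500):
    """Size the test at peak traffic plus headroom, within sensible bounds.
    One thread is assumed to model one concurrent request."""
    target = int(observed_peak_rps[service] * headroom)
    return max(floor, min(cap, target))

for svc in observed_peak_rps:
    print(svc, "->", threads_for(svc), "threads")
```

Deriving volumes from logs keeps each isolated back-end test realistic without the cost of simulating full front-end traffic.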
There can be up to 13 application programming interface (API) releases a week, so multiple tests run at the same time, which is a lot of work for two people. This is why half of the work is now automated, reducing the manual burden.
LoadRunner Professional testing has now eradicated performance issues with new services, as Smith explains: “Previously, about 25% of our releases would have performance issues. There were recurring ones and we would find those, but it would be at the end of the project, just before it was about to go live. We would save most of them from going to production, but we did have some that got through to production, and we also had a lot of delays through late testing. Now we find issues before they go down the production line, which eliminates last-minute testing and saves time and money. We no longer have any pre-production or production issues and we can deploy with confidence.
“Half the work is now covered by automation and that enables us to add value. This saves us employing another tester which, with wages and other overheads, would cost $186,000 a year, and I estimate that increased productivity is giving us a further $186,000 of testing for free. The total value to the company is $372,000 and that’s a conservative estimate.”
Streamlined testing means that developers are now totally engaged with the process, and LoadRunner Professional is part of a plan to introduce self-service test automation. Developers will send a message that will spin up the applications in the environment, run the load tests, and then email back the results. This means that testers will become subject matter experts, advising on performant design, analyzing performance issues, and modelling new scenarios.
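The planned self-service flow can be sketched end to end. Every function name, the message format, and the result fields here are illustrative assumptions standing in for real environment, test-tool, and mail integrations.

```python
def start_applications(env):
    """Stand-in for spinning up the applications in the test environment."""
    return f"{env} ready"

def run_load_test(service):
    """Stand-in for the load test; a real flow would invoke the test tool
    and read back measured results rather than these fixed figures."""
    return {"service": service, "p95_ms": 42, "tps": 310}

def email_results(recipient, results):
    """Stand-in for sending the report; returns the message body here."""
    body = (f"Load test for {results['service']}: "
            f"p95 {results['p95_ms']} ms, {results['tps']} tps")
    return f"To: {recipient}\n{body}"

def handle_request(message):
    """Drive the whole flow from a developer's request message."""
    start_applications(message["env"])
    results = run_load_test(message["service"])
    return email_results(message["requester"], results)

report = handle_request(
    {"env": "perf-test", "service": "catalogue-api", "requester": "dev@example.com"}
)
print(report)
```

Because the developer only supplies a message, the performance testers are freed from running routine tests and can focus on analysis and design advice, as the text describes.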