I am looking forward to sharing my thoughts on ‘Reinventing Performance Testing’ at the imPACt performance and capacity conference by CMG, held on November 7-10, 2016 in La Jolla, CA. I decided to publish a few parts here to see if anything triggers a discussion.
In more and more cases, performance testing should not be just an independent step of the software development life cycle, where you get the system shortly before release. In agile development / DevOps environments it should be interwoven with the whole development process. There are no easy answers that fit all situations. While agile development and DevOps have become mainstream, their integration with performance testing is only taking its first steps.
Integration support becomes increasingly important as we start to talk about continuous integration (CI) and agile methodologies. Until recently, while some vendors claimed that their load testing tools better fit agile processes, it usually meant only that the tool was a little easier to handle (and, unfortunately, often just because there was not much functionality offered).
What makes agile projects really different is the need to run a large number of tests repeatedly, which means tools must support performance testing automation. The situation has started to change recently as agile support became a main theme in load testing tools. Several tools have announced integration with continuous integration servers (such as Jenkins or Hudson). While the initial integration may be minimal, it is definitely an important step toward real automation support.
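To make this concrete, here is a minimal sketch of what such a CI step might look like (the metric names, thresholds, and results format are all hypothetical; a real load testing tool would export its own results file for the CI server to check):

```python
# Sketch of a CI gate for a load test run (hypothetical metric names,
# thresholds, and results format -- a real load testing tool would
# export its own results file for the CI server to check).

def check_results(results, max_avg_ms=500, max_error_rate=0.01):
    """Return threshold violations for a load test run.

    `results` is assumed to be a dict parsed from the tool's results
    export, e.g. {"avg_ms": 420, "error_rate": 0.002}.
    """
    violations = []
    if results["avg_ms"] > max_avg_ms:
        violations.append(
            f"avg response time {results['avg_ms']} ms exceeds {max_avg_ms} ms")
    if results["error_rate"] > max_error_rate:
        violations.append(
            f"error rate {results['error_rate']:.1%} exceeds {max_error_rate:.1%}")
    return violations

# A Jenkins/Hudson build step could run this check and fail the build
# (exit with a non-zero code) when the violations list is non-empty.
```

The point is not the thresholds themselves, but that a CI server can treat a performance test like any other automated check: run it on every build and fail fast when it breaks.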
It doesn’t look like we will get standard solutions here, as agile and DevOps approaches differ significantly, and proper integration of performance testing can’t be done without considering factors such as development and deployment processes, the system, the workload, and the ability to automate tests and automatically analyze their results.
The continuum here runs from traditional load testing (which basically means no real integration: it is a step in the project schedule, started as soon as the system is ready, but otherwise executed separately as a sub-project) to full integration into CI, where tests are run and analyzed automatically for every change in the system.
Automation here means not only using tools (tools are used in most performance testing anyway), but automating the whole process, including setting up the environment, running tests, and reporting and analyzing results. However, “full performance testing automation” doesn’t look like a feasible option in most cases. Automation in performance testing helps only with finding regressions and checking against requirements – and it should fit the CI process (being reasonable in length and in the amount of resources required). So large-scale, large-scope, and long-duration tests would probably not fit, nor would all kinds of exploratory tests (as explained in the Agile part). What is probably needed is a combination of shorter automated tests inside CI with periodic larger / longer tests outside, or maybe in parallel to, the critical CI path, as well as exploratory tests.
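The regression-finding part of this is the easiest to automate: compare the current run against a stored baseline and flag what got worse. A minimal sketch (the transaction names, the 15% tolerance, and the results format are assumptions for illustration):

```python
# Sketch of a regression check against a stored baseline (transaction
# names, the 15% tolerance, and the results format are illustrative).

def find_regressions(baseline, current, tolerance=0.15):
    """Flag transactions whose average response time degraded by more
    than `tolerance` relative to the baseline.

    Both arguments map transaction name -> average response time (ms).
    """
    regressions = {}
    for name, base_ms in baseline.items():
        cur_ms = current.get(name)
        if cur_ms is not None and cur_ms > base_ms * (1 + tolerance):
            regressions[name] = (base_ms, cur_ms)
    return regressions
```

A tolerance band like this is needed because performance results are noisy run to run; flagging every small difference would drown the CI process in false alarms.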
Other capabilities of load testing tools also matter for proper integration – such as cloud integration, support of new technologies, and integrated monitoring and analysis. Cloud integration (including public clouds, private clouds, and cloud services) simplifies deployment automation. Support of new technologies minimizes the amount of manual work needed. Integrated monitoring and analysis allow you to collect information and evaluate the results of performance tests (which may be quite sophisticated).
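Automated analysis usually starts by reducing raw results to a few summary metrics that can be checked or tracked over time. A minimal sketch using a nearest-rank percentile (the raw response-time samples would come from the tool’s results log):

```python
import math

# Sketch of reducing raw response-time samples to summary metrics
# (nearest-rank percentile; the samples would come from the load
# testing tool's results log).

def percentile(samples, p):
    """Nearest-rank p-th percentile of a non-empty list of samples."""
    ordered = sorted(samples)
    rank = math.ceil(p / 100 * len(ordered))  # 1-based nearest rank
    return ordered[max(0, rank - 1)]

def summarize(samples):
    """Reduce raw response times (ms) to metrics a CI check can use."""
    return {
        "avg_ms": sum(samples) / len(samples),
        "p95_ms": percentile(samples, 95),
        "max_ms": max(samples),
    }
```

Percentiles matter here because averages hide the tail: a run can have an acceptable average response time while a meaningful fraction of users see much worse.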