Are Times Still Good for Load Testing?
My post Good Times for Load Testing was published in 2014. It is hard to believe that five years have passed… Are times still good for load testing? Well, yes and no. I am not as upbeat as I was in 2014. If we speak about commercial load testing tools, we see a rather shrinking market and not much innovation recently. If we speak about open source tools, we definitely see some improvement – but in many areas they are still behind the best commercial tools (yes, many people may not need these advanced features, and yes, there are often other ways to do it – but when you do need them, it is a problem) – and it doesn’t appear that we see many (if any) interesting new products.
Let’s see what is going on with load testing tools (a few facts and my personal interpretation of the limited information I have) and then try to understand why (pure speculation). The views expressed here are my personal views only and do not represent those of my current or previous employers. All brands and trademarks mentioned are the property of their owners.
Commercial Tools [Crisis?]
If we talk about commercial vendors targeting the corporate market, I guess only Neotys remains independent and active (by active I mean contributing to the performance testing ecosystem – innovating, publishing content, participating in events, etc.). And yes, you can still get the NeoLoad free edition.
[Let me apologize right away to the many smaller independent vendors not mentioned here: I know that many exist and offer great products for specific areas, but I don’t hear much about them in the enterprise market and don’t see much contribution to the performance testing ecosystem.]
Micro Focus acquired both LoadRunner (from HP) and Silk Performer (as a part of Borland). Micro Focus had actually acquired QALoad from Compuware even earlier – but there is no trace of QALoad anymore. While both LoadRunner and Silk Performer appear to be alive and well, it is not quite clear how Micro Focus plans to handle two competing products. Other aspects are not quite clear either: you may still see the LoadRunner Community edition – but when you click on it, you get to the LoadRunner free trial. And it appears that quite a few people have left the LoadRunner team.
IBM Rational Performance Tester and IBM Rational Performance Tester on Cloud (including a no-charge Starter Edition) appear to be available. However, the current version mentioned is 9.2, released in March 2018 – so it doesn’t look like a lot is going on there.
SOASTA was acquired by Akamai and, while it is still possible to find CloudTest, I have heard practically nothing about it since the acquisition. At one point SOASTA had gathered a great team of performance experts – almost all of them have left the company by now.
Microsoft announced that cloud-based load testing in Microsoft Visual Studio and cloud-based load testing in Azure DevOps will be retired.
Oracle Application Testing Suite (OATS) has entered a sustaining (maintenance-only) stage.
Open Source
It appears that Apache JMeter has become the most popular load testing tool. In 2014, I was preparing a presentation about load testing tools and the criteria for selecting them. One criterion was the existence of an ecosystem (documents, expertise, people, services, etc.). It may not be the defining factor, but it is an important one to consider. To evaluate such ecosystems, in the absence of more sophisticated data, I used the number of documents Google finds and the number of jobs Monster finds mentioning each product.
LoadRunner (then an HP product) clearly held first place in both categories, with JMeter following not too far behind and Silk Performer (then a Borland product) a distant third. But when I checked again in 2018, JMeter appeared to be well ahead of LoadRunner (now a Micro Focus product) in both the number of documents and the number of jobs mentioning it, apparently becoming the most popular load testing tool. Of course, that doesn’t mean JMeter became the best tool for every task, but its popularity, in addition to it being an open source tool, definitely puts it high on the list of options to consider. It is also very important for an open source project to attract people who will work to improve it, thus ensuring the future of the product.
Another interesting trend is that JMeter scripts have become a de facto standard, and many SaaS tools are built on top of JMeter or at least support JMeter scripts (some also support Gatling, which appears to be the #2 open source tool). These tools complement JMeter (and Gatling) in many important ways and bring their functionality and services to a new level, allowing them to compete with commercial products in more sophisticated environments.
We see acquisitions among the commercial offerings built on top of open source too. BlazeMeter was acquired by CA, which, in turn, was acquired by Broadcom. It was still active for a while (they authored a good deal of the advanced JMeter content on the Internet), but it appears that a lot of people have left the company recently.
The theory behind acquisitions is that the bigger company will invest more and provide more opportunities for the smaller acquired company. Unfortunately, it doesn’t appear that any of the above-mentioned acquisitions worked quite that way.
One possible exception may be Flood, acquired by Tricentis – at least it appears that they are growing, not letting people go.
There are other small companies adding value to open source – for example, OctoPerf, RedLine13, UBIK, XMeter, and Loadium.
JMeter is getting closely integrated with other DevOps tools – and we have a lot of great content about integrating JMeter into DevOps, up to the recently published book Master Apache JMeter – From Load Testing to DevOps by Antonio Gomes Rodrigues, Philippe Mouawad, and Milamber.
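To make the DevOps point more concrete, here is a minimal sketch (in Python, purely as an illustration – the test plan name, results file name, and thresholds are all made up, and JMeter is assumed to be on the PATH) of the kind of gate a CI pipeline can put around a JMeter run: execute the test plan in non-GUI mode and fail the build if the error rate or average response time exceeds a threshold.

```python
import csv
import subprocess
import sys

JMX = "test_plan.jmx"       # hypothetical test plan name
JTL = "results.jtl"         # JMeter results file (CSV format by default)
MAX_ERROR_RATE = 0.01       # fail the build above 1% failed samples
MAX_AVG_MS = 500            # fail the build above 500 ms average response time

# Run JMeter in non-GUI mode, as recommended for unattended/CI runs.
subprocess.run(["jmeter", "-n", "-t", JMX, "-l", JTL], check=True)

# Parse the JTL results and compute simple pass/fail criteria.
with open(JTL, newline="") as f:
    samples = list(csv.DictReader(f))

failed = sum(1 for s in samples if s["success"] != "true")
error_rate = failed / max(len(samples), 1)
avg_ms = sum(int(s["elapsed"]) for s in samples) / max(len(samples), 1)

print(f"samples={len(samples)} error_rate={error_rate:.2%} avg={avg_ms:.0f} ms")
if error_rate > MAX_ERROR_RATE or avg_ms > MAX_AVG_MS:
    sys.exit(1)   # a non-zero exit code fails the CI step
```

In practice this is usually handled by a plugin (such as the Jenkins Performance plugin) or by Taurus pass/fail criteria rather than hand-written glue, but the principle is the same.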
The Gatling team now has a commercial offering, Gatling Enterprise, built on top of the open source product.
There are several other promising open source tools – such as Locust, k6, Tsung, and Taurus – but it doesn’t look like we have any major breakthrough here.
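To give a feel for the code-first style of these newer tools, here is a minimal Locust scenario (a sketch only – the endpoints, weights, and wait times are hypothetical, and it uses the pre-1.0 Locust API that was current at the time of writing):

```python
from locust import HttpLocust, TaskSet, task

class UserBehavior(TaskSet):
    @task(3)                 # weighted: browsing runs 3x more often than search
    def browse(self):
        self.client.get("/")

    @task(1)
    def search(self):
        self.client.get("/search", params={"q": "performance"})

class WebsiteUser(HttpLocust):
    task_set = UserBehavior
    min_wait = 1000          # think time between tasks, in milliseconds
    max_wait = 3000
```

It would be run with something like `locust -f locustfile.py --host=https://test.example.com`, with the number of simulated users and the hatch rate set in the web UI or on the command line.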
Still, open source tools are well behind the best commercial tools in some areas. The commercial offerings built on top of open source tools close some of these gaps. Another trend is to integrate them with other products that provide the needed functionality – such as monitoring, analysis, and reporting. They often get integrated with the TICK stack (Telegraf, InfluxDB, Chronograf, Kapacitor), the ELK stack (Elasticsearch, Logstash, Kibana), Prometheus, and Grafana.
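As an illustration of that last point, here is a sketch (again hypothetical – the database name, measurement, tags, and values are made up) of pushing a custom metric from a test harness into InfluxDB with the Python influxdb client, so it can be graphed in Grafana alongside the system metrics collected by Telegraf:

```python
from influxdb import InfluxDBClient  # InfluxDB 1.x client library

# Connection details are assumptions for this sketch.
client = InfluxDBClient(host="localhost", port=8086, database="loadtest")
client.create_database("loadtest")   # no-op if the database already exists

# One data point per transaction sample: tags for filtering, fields for values.
client.write_points([
    {
        "measurement": "transactions",
        "tags": {"transaction": "login", "run": "nightly"},
        "fields": {"response_time_ms": 234.5, "errors": 0},
    }
])
```

JMeter itself can stream results to InfluxDB directly through its Backend Listener, which is the more common setup – the point here is just how little glue code these open source stacks require.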
It is interesting that discussions nowadays are not about load testing tools anymore – but rather about performance testing frameworks, where a framework is a specific combination of products that allows meaningful performance testing. Basically, it replaces the old notion of a load testing tool covering all your load testing needs with the notion of a load testing component that should be complemented by a set of other tools to provide the full service. Unfortunately, the current level of integration rather reminds me of this picture:
What Is Going On?
We probably can separate overall performance testing trends and load testing tools trends.
I shared my thoughts on current performance testing trends in more detail in my Context-Driven Performance Engineering and Shift Left, Shift Right – Is It Time for a Holistic Approach? posts.
To summarize it in a simplified way, my impression is that performance testing is getting somewhat less popular due to some objective reasons (for example, we now have other ways to mitigate performance risks) as well as some subjective ones (the companies setting fashions in the high-tech industry are somewhat less sensitive to performance and reliability risks and mostly highlight other ways of mitigating them – so load testing became less sexy). There may be less need for simple load testing due to the increased scale and sophistication of systems – but it appears that companies don’t want to invest in more advanced kinds of performance testing tightly integrated into holistic performance engineering activities (even the idea itself sounds alien to many).
If we talk about load testing tool trends, we definitely see that large companies don’t want to invest much in load testing tools. Why? In addition to the performance testing trends mentioned above, possible contributing factors may be:
- development of enterprise-level tools requires a lot of effort and investment (in a rather shrinking commercial market);
- open source tools provide increasing competition;
- the lucrative enterprise market, which requires sophisticated tools, has entrenched incumbents (first of all, LoadRunner);
- almost all products are Web-based nowadays – and some popular technologies don’t require sophisticated tools.
Any thoughts?