Dockerised Jmeter + Grafana + Influxdb


Jmeter is a widely used load testing tool from Apache. It is open source.

Influxdb is a widely used time series database from InfluxData. It is open source.

Grafana is a widely used graphing and analytics engine from Grafana Labs (formerly Raintank). It is open source.

Docker is a platform designed to containerize applications and services. The core engine is open source, though Docker Desktop is proprietary with a free tier.

Jmeter has never matched the richness of features provided by many of its competitors in terms of user interface, usability and accessibility. In my opinion, it has always particularly lacked the results analysis and reporting capabilities found in tools such as LoadRunner.

It does provide raw results in common formats, but it has always been left to the performance test community to implement any analysis functions within a jmeter-driven framework.

This is undoubtedly a limitation, but at this price point, it feels rude to complain.

Influxdb is a natural fit for the output generated by a Jmeter test, not least because every record written to an Influx database is keyed on a timestamp.

Influxdb also contains some rudimentary graphing functionality – I’ve not investigated how deep this actually goes, but it is certainly sufficient for debugging purposes (i.e. confirming data is present as expected).

Grafana sits on top of Influxdb, consuming the raw data from the database and drawing some pretty graphs, tables and other “visualisations”.

Docker is a mechanism by which the above applications can be run on any machine without separate software installation and configuration. It uses containers – similar to lightweight virtual machines – to run the applications on demand.

As part of a longer-term project around a results comparison engine, I was tasked with building a small test framework – using Jmeter – to provide, initially, a consistent and repeatable results set.

This simple test takes no account of the expected load on the system, and hence no consideration has been given to distributed testing at this time. This is one of a number of necessary further enhancements.

The initial setup of the solution used a locally installed influxdb, jmeter 5.6.2 and Grafana instance.

The jmeter distribution used is the vanilla 5.6.2 with no additional plugins (a second necessary enhancement).

The influxdb installation used the installer package: influxdb2-client-2.7.3-windows-amd64.

The local instance of Grafana was configured to run on port 3001 to prevent confusion with the docker instance. This was not strictly necessary.

  1. The test runs against the application host – in this case blazeDemo.com.
  2. The application host processes the requests.
  3. Results data is written on-the-fly to the influxdb database (via the Backend Listener sketched after this list).
  4. Grafana consumes the data on-the-fly.
  5. The user is presented with a dashboard containing pre-configured graphs of the activity.
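The wiring between Jmeter and Influxdb is not reproduced in this post. Assuming the standard approach – Jmeter’s built-in Backend Listener using the InfluxdbBackendListenerClient – the listener settings would look something like the sketch below; the host, database and application names are illustrative and may differ from the working repository.

    # Backend Listener settings (InfluxdbBackendListenerClient) – illustrative values
    influxdbMetricsSender = org.apache.jmeter.visualizers.backend.influxdb.HttpMetricsSender
    influxdbUrl           = http://influxdb:8086/write?db=jmeter   # assumed host and database name
    application           = blazedemo
    measurement           = jmeter
    summaryOnly           = false
    samplersRegex         = .*
    percentiles           = 90;95;99
    testTitle             = docker-jmeter demo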

Docker-ising the Solution

The obvious question is Why?

In principle, as part of my work, it would be useful to be able to roll out a pre-configured, pre-packaged solution as a proof of concept. The solution can be run on any machine with Docker installed, and adding new test scripts and test scenarios to this model is sufficiently easy that it could be considered a worthwhile effort.

The obvious downside is that the test scripts will still need to be prepared and formalised into a test scenario with all the necessary datafiles and logs etc. Baselining activities are still required. Results and Reporting are still required.

It seems to me that the ability to run the test in a dockerised container is a solution to a problem that won’t exist until the performance test capability has reached a certain level of maturity – at which point the test mechanisms will be so well established that this solution will be largely irrelevant.

On the other hand, it does provide a storage solution for a performance test capability, and allows for less frequent, less high-profile performance testing to be undertaken with a minimum of resources and effort.

Essentially, you can shutter and box up the performance testing and then unbox it as and when required. This may not fit the model of many of my clients, who tend towards rapid, high-profile, repeated test cycles with many releases and truncated delivery windows. (Otherwise known as Agile delivery, but that’s a different story.)

The next obvious question is How?

Installation of the docker desktop application is the first step.

The following is based upon the work done here: https://github.com/jlight99/docker-jig, documented by Ellen Huang in a Medium article, though it is worth noting that I have extended the suite beyond that provided there.

Using docker compose, we are able to specify the 3 services – Influx, Jmeter and Grafana – in a yaml file as shown below:
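The compose file itself is not reproduced here; the following is a minimal sketch of its shape, assuming the stock influxdb and grafana images, the justb4/jmeter image for the load generator, and the service names, database name and ports used elsewhere in this post. The images, versions, volume paths and test settings in the working repository may differ.

    version: "3"
    services:
      influxdb:
        image: influxdb:1.8              # assumed 1.x image; the only change is the database name
        environment:
          - INFLUXDB_DB=jmeter           # assumed database name
        ports:
          - "8086:8086"

      grafana:
        image: grafana/grafana
        ports:
          - "3000:3000"
        volumes:
          - ./grafana:/etc/grafana/provisioning   # datasource and dashboard provisioning
        depends_on:
          - influxdb

      jmeter:
        image: justb4/jmeter:5.5         # vanilla v5.5 build, as noted below
        volumes:
          - ./jmeter:/jmeter             # test plan, data files and logs
        # test settings are passed to Jmeter as properties (illustrative values)
        command: ["-n", "-t", "/jmeter/test.jmx", "-Jthreads=10", "-Jrampup=60"]
        depends_on:
          - influxdb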

Within the declaration of the services, note we are using a standard distribution of Influx where our only configuration change is the name of the database.

The jmeter configuration is also using a standard distribution – in this case a vanilla build of v5.5.  

It is worth stating that the configuration also allows for the manipulation of test settings – shown here are the settings for the test scenario and the ramp-up to be used.

Clearly, in order for this to work, the scenario needs to be configured with placeholder variables in situ to handle both the presence and the absence of the value. In this case, we have a User Defined Variables section with a declaration along the lines shown below:
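The declaration itself is not reproduced in this post. The usual Jmeter pattern is the __P property function, which reads a property passed on the command line (for example -Jthreads=10) and falls back to a default when the property is absent. A User Defined Variables entry along those lines might look like this – the variable names and defaults are illustrative:

    # User Defined Variables – name / value pairs (illustrative)
    threads  = ${__P(threads,1)}      # taken from -Jthreads, defaults to 1 if not supplied
    rampup   = ${__P(rampup,60)}      # taken from -Jrampup, defaults to 60 seconds
    duration = ${__P(duration,300)}   # taken from -Jduration, defaults to 300 seconds

The Thread Group then references ${threads}, ${rampup} and ${duration} rather than hard-coded values, so the same .jmx can be driven by different scenarios.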

Elsewhere in the configuration is a user properties file.

This is accessed within Jmeter in a similar manner:
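The properties file is likewise not reproduced here. As a sketch, and assuming illustrative property names, a user.properties entry and its reference inside the test plan take this form:

    # user.properties – loaded by Jmeter at start-up
    influxdb.host = influxdb
    influxdb.port = 8086

    # referenced inside the test plan in the same way as command-line properties:
    #   ${__P(influxdb.host)}   ${__P(influxdb.port)}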

Since the test .jmx needs to be pre-configured and mounted into the container as part of the execution, it may be possible to extend the Jmeter installation to include plugins simply by declaring them as part of the test. This is out of scope for v1.0 of this solution.

The Grafana declaration is also standard, but contains code to establish a default collection point (data source) – in this case influxdb:8086 – and to set up a default dashboard.
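The provisioning file is not shown in this post; assuming Grafana’s standard file-based datasource provisioning, it looks roughly like this (the file name and database name are assumptions):

    # grafana/datasources/influxdb.yaml – read from /etc/grafana/provisioning/datasources
    apiVersion: 1
    datasources:
      - name: InfluxDB
        type: influxdb
        access: proxy
        url: http://influxdb:8086
        database: jmeter        # assumed database name; must match the Influx service
        isDefault: true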

The dashboard out of the box is limited to a view of the activity against the target application; there are no transactions or additional measurement points as standard. The tester is free, however, to set up the visualisations as required and to update the json file which defines these graphs.

This is configured in the \grafana\dashboards\jmeter-cnt.json file.
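For completeness, the dashboard json is picked up by a provisioning entry of roughly this shape, assuming Grafana’s standard dashboard provisioning (the provider name and paths are assumptions, and the working repository may arrange this differently):

    # grafana/dashboards/provider.yaml – read from /etc/grafana/provisioning/dashboards
    apiVersion: 1
    providers:
      - name: jmeter-dashboards
        type: file
        options:
          path: /etc/grafana/provisioning/dashboards   # folder containing jmeter-cnt.json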

Execution of the test

To build and run the pack, navigate to the location of the pack on the local machine in a shell.

Enter “docker-compose up”

Following the test, ctrl + c will stop everything, though “docker-compose down” is also provided.
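In shell terms, the full cycle is along these lines (the path is illustrative):

    cd /path/to/docker-jmeter     # location of the pack on the local machine
    docker-compose up             # build (first run) and start influxdb, grafana and jmeter
    # ... test runs; dashboard visible at http://localhost:3000 ...
    # Ctrl+C stops the attached containers
    docker-compose down           # tear everything down cleanly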

I have found that, although the jmeter test duration can be limited and its container will stop when the test finishes, the Grafana and influxdb containers continue to run and can only be stopped in this way.

They do not need to be stopped before running an additional test, but additional shell windows will be spawned each time, which is less than ideal.

Stopping all containers and rebuilding is necessary if significant changes are made to the config, and this seems potentially error-prone. Neglecting to rebuild before running a test means an inappropriate test is executed, and this will likely not be noticed until the test has completed at the earliest, and potentially much later.
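One way to reduce that risk, assuming the compose file builds a local image (for the jmeter service, for example), is to make the rebuild part of the start-up command:

    docker-compose down           # stop and remove any running containers
    docker-compose up --build     # force a rebuild of locally built images before the test starts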

The Grafana dashboard is available at localhost:3000, allowing the test to be monitored as it runs.

The working version of this is available here: https://github.com/automationsolutions-org/docker-jmeter


In Conclusion

I find myself strangely ambivalent about this solution. It may be that this is simply because I don’t have a use case for it in the real world, though I can absolutely see how it could fit into an established CI/CD pipeline where docker is already in situ.

To me, there are question marks over the use of docker as I’m not sure it solves a problem I’ve encountered in my 20+ years.

The distribution and use of jmeter has simply never been an issue, and though the results and reporting are not perfect out of the box, there are question marks over the use of Grafana as well, for that matter – not because there is anything fundamentally wrong with the software, but because it doesn’t address one of the fundamental needs of any performance test capability: the requirement to compare today’s results with the results from yesterday, last week or last year.
Influx will store that data, but until we have a better mechanism for comparing results historically, this solution seems limited.

What it does provide, however, is a useful foundation for expansion. There are a number of issues – some minor, some major – which, if handled appropriately, could go some way toward making this a useful tool.

These are the issues I’ve identified and will seek to address:

  1. Lack of support for distributed testing.
  2. Potential issues around plugin usage – investigation required.
  3. The results are stored within the docker container, which makes reporting more difficult than if they were mounted on a shared drive, say.
  4. Limited or no ability to compare results directly.