

Using JMeter, you can simulate load on your server and observe which configuration performs best on your machine.

The test simulates load from a pre-defined number of users. During its session, each simulated user:

  1. logs in
  2. visits the My Projects page
  3. displays the dashboard for a project (chosen randomly from the list of available projects)
  4. displays 3 charts at random from the dashboard
  5. goes to the action items page and views 7 action items
  6. goes to the highlights tab and applies 5 highlight categories
  7. goes to the findings tab and applies 5 different categories
  8. goes back to the dashboard and expands 3 nodes in the artefact tree
  9. logs out

(Steps 4 through 8 are repeated for 3 projects during the user session.)

Projects need to exist on the server already, and the set of projects and versions should be a pre-defined set so that the test runs are repeatable and can be compared.

Ideally, you should set up projects and users on the server, take a backup of this test database and restore it after each test so you can compare results between runs.
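As a sketch of this backup/restore cycle, assuming a PostgreSQL backend (the database name "squore" and the dump filename are placeholders, not part of the test package):

```
# Taken once, after projects and users have been set up
pg_dump -Fc squore > squore-baseline.dump

# ... run a load test ...

# Reset the database before the next run so results stay comparable
pg_restore --clean --if-exists -d squore squore-baseline.dump
```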


Make sure to point the test at a test installation, as this procedure modifies the database content (the accounts used for the test will log in, which affects each user's history).

  • Squore is installed and running
  • If you want to simulate 20 users running in parallel, you need to have at least 20 accounts that exist in your database and can access projects
  • JMeter (3.3 or later, up to 5.1.1) is installed on the machine that will run the test
  • You have downloaded the JMeter test script and test data files for your version
  • You are using a properties.xml that forces all dashboard tabs to be visible by default:
    <tab name="dashboard" default="true"/>
    <tab name="action-items" rendered="true"/>
    <tab name="highlights" rendered="true" />
    <tab name="findings" rendered="true" />
    <tab name="reports" rendered="true" />
    <tab name="attributes" rendered="true"/>
    <tab name="indicators" rendered="true"/>
    <tab name="measures" rendered="true"/>
    <tab name="annotations" rendered="true"/>

From Squore 20.0 onwards:

        <tab name="dashboard" default="true"/>
        <tab name="action-items" rendered="true"/>
        <tab name="highlights" rendered="true"/>
        <tab name="findings" rendered="true"/>
        <tab name="documents" rendered="true"/>
        <tab name="attributes" rendered="true"/>
        <tab name="indicators" rendered="true"/>
        <tab name="measures" rendered="true"/>
        <tab name="annotations" rendered="true"/>

Configuring Users

The list of users (and their passwords in clear text) that will be used during the test must be specified in users.csv. The CSV format is "user,password", with one line per account.

Either create dedicated users for the tests, or use existing users (preferred way).
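If you opt for dedicated accounts, the file can be generated with a small shell loop. The account names and passwords below are made-up placeholders; they must match accounts that actually exist on your test server.

```shell
# Generate a users.csv for 20 simulated users,
# one "user,password" line per account (placeholder credentials)
for i in $(seq 1 20); do
  echo "loadtest$i,loadtest$i"
done > users.csv

head -n 1 users.csv   # → loadtest1,loadtest1
```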

Running the Test

To trigger a load test with 20 simulated users, execute this command:

<jmeter_home>/bin/jmeter(.sh|.bat) -n -t <package_dir>/load.jmx -l <package_dir>/results.jtl -q <package_dir>/ -Jserver.port=8180 -Jthreads=20

Available parameters:

  •  - Squore server host
  • server.port - Squore server port number
  • threads - number of users to simulate
  • threads.rampup - time to get all the threads started, in seconds. Note that it is not realistic for all the load to be sent at the same time, so consider starting one thread per second by setting rampup to the same value as the number of users in your test
  • loops - number of times to perform the test case (default: 1)
  • duration - duration of the tests, in seconds (default: 1800)
  • think.time - delay in ms between user actions (default: 7000 ms)

Note that you can also specify parameters in the properties file (passed with -q) instead of on the command line.
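For example, a properties file passed with -q could pin the parameters for a run. The values below are illustrative only, and the server host property is omitted because its exact name depends on your test package:

```
# Illustrative load-test parameters
server.port=8180
threads=20
threads.rampup=140
duration=1800
think.time=7000
```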

The test runs until either the number of loops is reached or the duration is reached - whichever occurs first.

You are strongly encouraged to customize the threads.rampup parameter to get relevant test results. If all threads start in a short period of time, they will all be in the same step at the same time, which is probably not what you want. For example, use this formula: threads * think.time / 1000. In addition, this avoids some concurrency issues on the server side if you use a high number of threads with only a few user accounts.
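As a worked example of that formula, with 20 threads and the default think time of 7000 ms:

```shell
# rampup (seconds) = threads * think.time (ms) / 1000
threads=20
think_time=7000   # the default think.time, in ms
rampup=$(( threads * think_time / 1000 ))
echo "$rampup"    # → 140
```

So -Jthreads.rampup=140 would start roughly one thread every 7 seconds.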

Take care that successive runs append their samples to the results.jtl file.
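To keep runs separate, you can archive any previous results file before starting a new test. The timestamped filename below is just a convention, not something the test package requires; the first line only simulates a leftover file for demonstration.

```shell
# Simulate a leftover results file from a previous run
echo "timeStamp,elapsed,label" > results.jtl

# Archive it so the next run starts with a fresh results.jtl
if [ -f results.jtl ]; then
  mv results.jtl "results-$(date +%Y%m%d-%H%M%S).jtl"
fi
```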

A log directory is created in your working directory, containing all encountered errors (either from the Squore server or from the JMeter test plan). You can provide it, along with the server.log file, for analysis.

Analysing JMeter Results

While the test is running, you will get a summary of the number of requests sent, the min, max and average response time, as well as the error rate and the number of active/finished threads:

Starting the test @ Mon Feb 05 18:26:14 CET 2018 (1517851574410)
Waiting for possible Shutdown/StopTestNow/Heapdump message on port 4445
summary + 14 in 00:00:15 = 0.9/s Avg: 711 Min: 66 Max: 3064 Err: 0 (0.00%) Active: 1 Started: 1 Finished: 0
summary + 65 in 00:00:30 = 2.2/s Avg: 201 Min: 42 Max: 674 Err: 0 (0.00%) Active: 1 Started: 1 Finished: 0

The console will print these messages at the end of the run:

Tidying up ... @ Mon Feb 05 18:27:19 CET 2018 (1517851639379)
... end of run

When the test run is complete, you can consult the detailed results in results.jtl, a CSV file containing detailed data about each request made against the server during the test. All time-related columns are in milliseconds.
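As a minimal sketch of processing that file, assuming the default JMeter CSV layout (header line first, elapsed time in the second column), the average response time can be computed with awk. The sample file below is fabricated for demonstration; a real results.jtl is produced by the jmeter run above.

```shell
# Fabricated three-sample results file, for demonstration only
cat > results.jtl <<'EOF'
timeStamp,elapsed,label,responseCode
1517851574410,711,login,200
1517851574999,201,dashboard,200
1517851575500,88,logout,200
EOF

# Average the "elapsed" column (ms), skipping the header line
awk -F, 'NR > 1 { total += $2; n++ }
         END { printf "%d samples, avg %.1f ms\n", n, total / n }' results.jtl
# → 3 samples, avg 333.3 ms
```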

You can analyse the results in Excel, or generate an HTML report to view the results, as described in the JMeter documentation.
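For the HTML report, JMeter can generate its dashboard directly from an existing results file with the -g and -o options (the output directory must not already exist; the path below is a placeholder):

```
# Generate an HTML dashboard report from a completed run's .jtl file
<jmeter_home>/bin/jmeter -g results.jtl -o report-output
```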
