Metrics

During the lifecycle of a test, Artillery reports several metrics. These metrics are aggregated every 10 seconds into intermediate reports (shown in the console), and then aggregated again at the end of the test into a summary report. Artillery can also generate a JSON report (written with the run command's --output flag) containing all intermediate reports as well as the aggregate report.

In this section, you can review the different types of metrics emitted and their meanings.

Types of Metrics

Counters

Counters are discrete measurements that add up all occurrences of an event during the measured interval.
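For example, a custom counter can be incremented from a JavaScript processor function (see Custom Metrics below for how such functions are wired into a test). A minimal sketch; the function name and the checkout.attempts metric name are made up for illustration:

```js
// Scenario "function" step that increments a custom counter.
// "checkout.attempts" is a hypothetical metric name.
function countCheckoutAttempt(context, events, done) {
  // Adds 1 to the counter for the current reporting interval
  events.emit('counter', 'checkout.attempts', 1);
  return done();
}

module.exports = { countCheckoutAttempt };
```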

Histograms

Histograms are used to represent the statistical distribution of a set of values observed during the measured interval. All histograms share the following statistical properties:

  • min: Minimum value observed;
  • max: Maximum value observed;
  • mean: Sum of all values observed divided by the total count;
  • median: Median of all values observed;
  • p50: 50th percentile of all values observed (i.e. if this value is x, then 50% of all values were lower than x);
  • p75: 75th percentile of all values observed (i.e. 75% of all values were below it);
  • p90: 90th percentile of all values observed (i.e. 90% of all values were below it);
  • p95: 95th percentile of all values observed (i.e. 95% of all values were below it);
  • p99: 99th percentile of all values observed (i.e. 99% of all values were below it);
  • p999: 99.9th percentile of all values observed (i.e. 99.9% of all values were below it);
  • count: How many values were used to calculate the distribution.
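A histogram is a natural fit for recording custom timings. A minimal sketch, assuming a processor function wired into a scenario; the function name and the signing.duration_ms metric name are made up:

```js
// Time a piece of custom work and record the duration (in milliseconds) as a
// histogram value; Artillery derives min/max/mean/median/p95/p99 and so on
// from all values recorded during the interval.
const crypto = require('crypto');

function signPayload(context, events, done) {
  const startedAt = Date.now();
  crypto.pbkdf2Sync('secret', 'salt', 100000, 64, 'sha512'); // stand-in for real work
  events.emit('histogram', 'signing.duration_ms', Date.now() - startedAt);
  return done();
}

module.exports = { signPayload };
```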

Rates

Rates are used to represent the rate of occurrence of an event during the measured interval (10 seconds).
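For example, a rate can track how often a particular event fires per second. A minimal sketch; the function name and the search.performed metric name are made up:

```js
// Emit a rate event each time this scenario step runs; Artillery reports how
// often "search.performed" occurred per second over each reporting interval.
function trackSearch(context, events, done) {
  events.emit('rate', 'search.performed');
  return done();
}

module.exports = { trackSearch };
```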

Custom Metrics

Custom metrics are emitted as one of the metric types above (counter, histogram, or rate), and they automatically appear in the reports alongside the metrics emitted by Artillery itself.
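As an illustration of how this fits together, the sketch below assumes a processor module that the test script references via config.processor and attaches to a request as an afterResponse hook; the function name and the http.responses_5xx metric name are made up:

```js
// processor.js (hypothetical): count 5xx responses with a custom counter.
// Assumes the test script sets config.processor to this file and attaches
// trackServerErrors to a request via afterResponse.
function trackServerErrors(requestParams, response, context, events, next) {
  if (response.statusCode >= 500) {
    events.emit('counter', 'http.responses_5xx', 1);
  }
  return next();
}

module.exports = { trackServerErrors };
```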

Metrics reported by Artillery

Artillery reports the following default metrics:

| Metric | Type | Description |
| --- | --- | --- |
| errors.<error_name> | Counter (count) | Number of errors of that type. |
| vusers.created | Counter (count) | Number of VUs created. |
| vusers.created_by_name.<scenario_name> | Counter (count) | Number of VUs created by that scenario. |
| vusers.failed | Counter (count) | Number of failed VUs. |
| vusers.completed | Counter (count) | Number of completed VUs. |
| vusers.skipped | Counter (count) | Number of skipped VUs (due to use of the maxVusers option). |
| vusers.session_length.<aggregation> | Histogram (milliseconds) | How long it took virtual users to complete each session. |

Engine-specific metrics

Plugin-specific metrics