# `ensure` - Expectations & Assertions

Artillery can validate whether a metric's value meets a predefined threshold. If it doesn't, it will exit with a non-zero exit code. This is especially useful in CI/CD pipelines for automatic quality checks and as a way to check that [SLOs](https://sre.google/sre-book/service-level-objectives/) are met.

#### Syntax

```yaml
config:
  plugins:
    ensure:
      thresholds:
        - 'metric.name.one': value1
        - 'metric.name.two': value2
      conditions:
        - expression: 'metric.name.one <= value1 and metric.name.two > value2'
          strict: true|false # defaults to true
```

Two types of checks may be set:

* `thresholds` check that a metric's value is less than a defined threshold value
* `conditions` can be used to create advanced checks that combine multiple metrics in a single expression

Any of the metrics tracked during a test run may be used for setting checks. Both built-in and [custom metrics](/guides/guides/extension-apis#tracking-custom-metrics) may be used.

> **Info:** Using a non-existent metric name will cause that check to fail
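
Custom metrics can be referenced by name in exactly the same way as built-in ones. The sketch below assumes a hypothetical custom histogram named `app.login_duration` emitted from a processor function; replace it with the name of a metric your own test actually records:

```yaml
config:
  plugins:
    ensure:
      thresholds:
        # Built-in metric:
        - 'http.response_time.p99': 250
        # Hypothetical custom histogram emitted by a processor function:
        - 'app.login_duration.p95': 150
```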

#### Threshold checks

A threshold check ensures that the aggregate value of a metric is under some threshold value.

```yaml
config:
  plugins:
    ensure:
      thresholds:
        # p99 of response time must be under 250ms:
        - 'http.response_time.p99': 250
        # p95 of response time must be under 100ms:
        - 'http.response_time.p95': 100
```

#### Advanced conditional checks

More complex checks may be set with conditional expressions:

```yaml
config:
  plugins:
    ensure:
      conditions:
        # Check that we generated 1000+ requests per second and that p95 is < 250ms
        - expression: 'http.response_time.p95 < 250 and http.request_rate > 1000'
```

Setting `strict: false` on a condition makes that check optional. A failing optional check does not cause Artillery to exit with a non-zero exit code. Checks are strict by default.
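
For example, in the following sketch the latency check is strict while the request-rate check is optional; if only the optional check fails, the run still exits with code 0:

```yaml
config:
  plugins:
    ensure:
      conditions:
        # Strict by default - a failure produces a non-zero exit code:
        - expression: 'http.response_time.p95 < 250'
        # Optional - a failure is reported but does not fail the run:
        - expression: 'http.request_rate > 1000'
          strict: false
```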

##### Expression syntax

| Numeric arithmetic | Description |
| ------------------ | ----------- |
| `x + y`            | Add         |
| `x - y`            | Subtract    |
| `x * y`            | Multiply    |
| `x / y`            | Divide      |
| `x % y`            | Modulo      |
| `x ^ y`            | Power       |

| Comparisons | Description              |
| ----------- | ------------------------ |
| `x == y`    | Equals                   |
| `x < y`     | Less than                |
| `x <= y`    | Less than or equal to    |
| `x > y`     | Greater than             |
| `x >= y`    | Greater than or equal to |

| Boolean logic | Description                   |
| ------------- | ----------------------------- |
| `x or y`      | Boolean or                    |
| `x and y`     | Boolean and                   |
| `not x`       | Boolean not                   |
| `x ? y : z`   | If boolean x, value y, else z |
| `( x )`       | Explicit operator precedence  |

| Built-in functions | Description                           |
| ------------------ | ------------------------------------- |
| `ceil(x)`          | Round floating point up               |
| `floor(x)`         | Round floating point down             |
| `random()`         | Random floating point from 0.0 to 1.0 |
| `round(x)`         | Round floating point                  |
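
The operators above can be combined into a single condition. The following sketch (using only metrics shown elsewhere on this page) combines arithmetic, comparison, boolean logic and explicit parentheses:

```yaml
config:
  plugins:
    ensure:
      conditions:
        # p99 may exceed p95 by at most 100ms, and throughput must stay above 500 requests/second:
        - expression: '(http.response_time.p99 - http.response_time.p95) <= 100 and http.request_rate > 500'
```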

#### Basic checks

> **Info:** This way of specifying checks is retained for backwards-compatibility with
> Artillery v1, and should no longer be used for new tests.

You can check that the aggregate response time latency is under a specific threshold. For example, to check that the aggregate `p95` latency of a performance test is 200 milliseconds or less, add the following configuration to your script:

```yaml
config:
  ensure:
    p95: 200
```

In this test definition, Artillery will exit with a non-zero exit code if the aggregate `p95` is over 200 milliseconds.

You can validate the aggregate latency for `min`, `max`, `median`, `p95`, and `p99`.
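
For example, assuming multiple keys may be combined in the same `ensure` block, a minimal sketch of the legacy syntax:

```yaml
config:
  ensure:
    median: 100
    p95: 200
    max: 500
```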

You can also verify that the error rate of your performance test doesn't exceed a defined percentage. The error rate is the ratio of virtual users that didn't complete their scenarios successfully to the total number of virtual users created during the test. For instance, if your performance test generates 1000 virtual users and 50 didn't complete their scenarios successfully, the error rate for the performance test is 5%.

The following example will make Artillery exit with a non-zero exit code if the total error rate exceeds 1%:

```yaml
config:
  ensure:
    maxErrorRate: 1
```
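
Latency and error-rate checks can also be combined. A minimal sketch, assuming both keys may be set together:

```yaml
config:
  ensure:
    p95: 200
    maxErrorRate: 1
```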
