16 December 2020 · 3 min read

Limit the number of allowed failing Robot Framework tests in your project

We can all agree that sometimes you have a valid test in mind, but your implementation is not yet able to pass it. Now you face a dilemma: do I commit the test and fail the build, or do I spend more time implementing the solution as well? The usual answer is to implement the solution first, mainly because most test frameworks fail the build if there is a failing test. Failing the build on a failing test is logical, since it prevents regressions, but in our case we only want the build to fail if any other test fails. For that we can use the Melexis Warnings plugin, which lets you set a limit on how many tests are actually expected to fail. It does not matter what you use for building, from GitHub Actions to GitLab CI, Travis CI, Circle CI, ... there is always a place for a command line tool that fails your build when the number of warnings is higher (or lower) than expected.

Set up a warnings limit for a Robot Framework test suite

First you need to install the tool through pip in your CI file (or simply install it in your Docker container):

pip3 install mlx.warnings

The plugin can be used in two ways. The easiest is the command-line option, where you point it at the Robot Framework output file and pass the minimum and maximum number of warnings allowed (or use the exact-warnings flag). Any change in the number of failing Robot Framework tests outside those limits then fails the build.
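For example, a single global limit could be set like this (a minimal sketch based on the flags described in the mlx.warnings documentation; check mlx-warnings --help on your installed version for the exact names):

# parse the Robot Framework output, allowing between 8 and 10 failing tests
mlx-warnings --robot --minwarnings 8 --maxwarnings 10 output.xml

# or require exactly 8 failing tests
mlx-warnings --robot --exact-warnings 8 output.xml

If you want per-suite limits instead, you need to use a JSON configuration file. For test suites named My First Suite and My Second Suite, its content could look like this: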

{
    "robot": {
        "enabled": true,
        "check_suite_names": true,
        "suites": [
            {
                "name": "My First Suite",
                "min": 8,
                "max": 10
            },
            {
                "name": "My Second Suite",
                "min": 0,
                "max": 0
            }
        ]
    }
}

Here min and max bound the allowed number of failing tests per suite, so setting both to 0 means My Second Suite must pass completely, while check_suite_names should make the check fail if a listed suite does not appear in the output at all. The command to verify the Robot Framework output against this configuration would then be:

mlx-warnings --config path/to/library/config.json output.xml
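In a CI pipeline this check is just one extra step after the test run. A minimal sketch (paths and file names here are placeholders): since robot itself exits non-zero when tests fail, its exit code is deliberately ignored, so that mlx-warnings alone decides the build result:

# run the tests, but do not let robot's own exit code fail the build step
robot --output output.xml tests/ || true
# fail the build only when the number of failing tests violates the limits
mlx-warnings --config path/to/library/config.json output.xml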

This is how you can commit a test case which currently fails, in a safe way that prevents you from adding more failing tests or, even worse, missing a regression. You maintain control, but also enhance your testing, with the help of a simple command line tool that keeps failures at an exact number.
