Self-Writing Test Cases – Angie Jones

Self-Writing Test Cases


Test cases are important. They provide step-by-step instructions on how to validate a feature. They are essential for cross-training and necessary for auditing. They also take quite a bit of time to write, do not seem to be testers’ favorite activity, and instantly become outdated whenever the application changes.

The solution is to develop your automation in such a way that it generates a test case, including the steps that were taken, each time the test executes.

Your automation runs frequently, so it’s constantly checking that the steps are still correct. If, by chance, the application has changed, the test will fail and you’ll know right away. Ideally, you would update the automation script and then go to your test case repository and update the written test case. We all know that this is a step that is frequently skipped, resulting in out-of-date test cases. By having the test case generated by the automation execution, it’s a step you no longer have to worry about.

Here’s how to set up such a logging framework.


A great logging framework to use for this output is log4j bound to the Simple Logging Facade for Java (SLF4J). Add the following dependency to your Maven pom.xml. This will pull down the jars that are needed and allow you to import the logging classes into your test classes.
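The original snippet didn’t survive extraction; the SLF4J-to-log4j binding is published under the Maven coordinates below (pin whichever version fits your build — 1.7.36 shown here as an example):

```xml
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.36</version>
</dependency>
```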



Within a base class that all of your test classes will inherit from, import Logger and LoggerFactory:

Then add a method to the base class that will be used to print out a description of each test step:
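The original snippet didn’t survive the page extraction, so here is a minimal sketch of that base class. Two assumptions to note: the "STEP:" prefix is an invented style, and java.util.logging stands in for SLF4J only so the sketch compiles with the JDK alone — in the real setup you’d import org.slf4j.Logger and org.slf4j.LoggerFactory instead.

```java
import java.util.logging.Logger;

// Base class that all test classes inherit from. The article binds SLF4J to
// log4j; the JDK logger is substituted here only so the sketch is self-contained.
class BaseTest {
    protected static final Logger log = Logger.getLogger(BaseTest.class.getName());

    // Call before each action in a test; the accumulated lines form the test case.
    protected void stepInfo(String description) {
        log.info(formatStep(description));
    }

    // Formatting kept separate so the report style is easy to change later.
    static String formatStep(String description) {
        return "STEP: " + description;
    }
}
```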



While writing your test script, you’ll need to make a call to the stepInfo() method before each action that you’re taking. The code below resides in a test class that inherits from the base class.
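That snippet is also missing from this copy of the post, so the sketch below shows the shape of such a test. LoginTest and its contents are hypothetical: a list collects the output in place of the logger purely so the sketch runs on its own, and in the real framework this would be a TestNG @Test method in a class that extends the base class.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical test class; a list stands in for the base class's logger so
// the sketch is runnable without TestNG or Selenium.
class LoginTest {
    static final List<String> report = new ArrayList<>();

    static void stepInfo(String description) {
        report.add("STEP: " + description);
    }

    // In the real framework: @Test public void verifyLogin() { ... }
    static void verifyLogin(String username, String password) {
        // username comes from a data source, so the logged step always matches it
        stepInfo("Log into application as " + username);
        // loginPage.login(username, password);  // real action would go here
    }
}
```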

Notice that the username variable is included in the output. This is intentionally not hardcoded. Using this approach, the test output will always reflect the correct data no matter what source the username is populated from.

At first, remembering to do this before each action will take a bit of getting used to, but it quickly becomes second nature.


What we have so far is a basic reporting structure that gives you your self-writing test cases. However, if you want something even more advanced, you can create a new utility class with several different logging methods that output data in different styles depending on the type of output (e.g., step, verification, etc.). This improves the readability of the report/test case.

Then you can call these methods throughout your framework to output data in the right style. For example, let’s revisit the base class and change the stepInfo method to use ReportUtils:
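These snippets are missing from this copy of the post, so here is a hedged sketch. The styles below — bold steps, a "Verify:" prefix, red errors for the HTML report — are assumptions about the original formatting, not the article’s actual code.

```java
import java.util.logging.Logger;

// Hypothetical ReportUtils: one method per output type, so every part of the
// framework reports in a consistent style. The HTML markup is an assumed style.
class ReportUtils {
    static String step(String description) {
        return "<b>" + description + "</b>";
    }

    static String verification(String description) {
        return "Verify: " + description;
    }

    static String error(String message) {
        return "<span style=\"color:red\">ERROR: " + message + "</span>";
    }
}

// The base class revisited: stepInfo now delegates styling to ReportUtils.
class BaseTest {
    protected static final Logger log = Logger.getLogger(BaseTest.class.getName());

    protected void stepInfo(String description) {
        log.info(ReportUtils.step(description));
    }
}
```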

We can get fancy and even automatically add a step number before each printed step:
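One way to sketch this (the counter and its placement are assumptions, and a list again stands in for the logger so the sketch runs alone): keep an incrementing step number in the base class and prepend it in stepInfo.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

// Step numbering sketch: in the real framework, reset the counter in a
// @BeforeMethod so each test's report starts at Step 1.
class BaseTest {
    static final AtomicInteger stepNumber = new AtomicInteger(1);
    static final List<String> report = new ArrayList<>();

    static void stepInfo(String description) {
        report.add("Step " + stepNumber.getAndIncrement() + ". " + description);
    }
}
```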

In your TestNGListener (or whatever wrapper class you’re using for error/failure handling), you can add a line to report errors in red:
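In TestNG the natural hook for this is ITestListener.onTestFailure(ITestResult). The sketch below isolates just the red styling — an assumed HTML convention — so it runs without the TestNG jar.

```java
// Failure-reporting sketch: in a real TestNG listener this string would be
// logged from onTestFailure(ITestResult result). The markup is an assumption.
class FailureReporter {
    static String error(String message) {
        return "<span style=\"color:red\">ERROR: " + message + "</span>";
    }
}
```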

For more micro-logging, you can add additional logging to all of your framework classes so that information about each step is reported. Here’s an example within a class that uses the Page Object Model:
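The page-object snippet is missing from this copy, so the sketch below recreates the idea: every low-level action logs what it just did, producing the "Set username Angie" style lines shown in the sample report. The Selenium calls are commented out and a list collects the output so the sketch runs on its own.

```java
import java.util.ArrayList;
import java.util.List;

// Page Object sketch: micro-logging inside each action method.
class LoginPage {
    static final List<String> report = new ArrayList<>();

    static void logInfo(String message) {
        report.add(message);  // the real framework would call the logger instead
    }

    void setUsername(String username) {
        // usernameField.sendKeys(username);  // real Selenium action
        logInfo("Set username " + username);
    }

    void setPassword(String password) {
        // passwordField.sendKeys(password);
        logInfo("Set password " + password);
    }

    void clickSubmit() {
        // submitButton.click();
        logInfo("Click button Submit");
    }
}
```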

When running a test, we get a beautiful test case, something like this:

Step 1. Log into application as Angie
Set username Angie
Set password passw0rd
Click button Submit
Verify: Home page displayed

This HTML report can be stored however you see fit. There are several test case repositories that provide APIs for adding/updating test cases. Or perhaps you want to save these files to a directory after each execution. Either way, they can serve as auto-generated, up-to-date test cases.
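For the save-to-a-directory option, a minimal sketch using java.nio (the "<testName>.html" naming scheme is an assumption):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Persists the collected report lines as one file per test execution.
class ReportWriter {
    static Path save(List<String> reportLines, Path directory, String testName) {
        try {
            return Files.write(directory.resolve(testName + ".html"), reportLines);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Helpers below wrap checked IO exceptions so callers stay concise.
    static List<String> readBack(Path file) {
        try {
            return Files.readAllLines(file);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    static Path tempDirectory() {
        try {
            return Files.createTempDirectory("reports");
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```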


A few caveats:

  • Going back and adding this to an existing framework may be quite tedious. While it’s definitely possible, this would certainly be less painful to implement in a new framework.
  • For this to truly take the place of manually written test cases, automation engineers must be diligent about adding the logs before each action taken. This is not optional! Also, whenever automation code is changed, the engineer must review the logging statements to see if they also need to be updated. A good practice is to read the report file when testing the automation script to ensure it’s accurate.
  • It may seem like overkill to have to call these logging methods before every action, but it truly becomes second nature. I used this approach with a team of 10 automation engineers and 99% of the time, they remembered to add their logs. The other 1% was caught in code reviews 😉
Angie Jones
  • Susie Tyler

    Hi Angie,

    Forgive me if this is a newbie question, but does this apply to test cases written for regression testing, or for any/all testing carried out during the development cycle?

    I ask because I am researching the possibility of moving to a checklist method of testing during development, to save time (so much time spent writing detailed test cases) as we are very short of testers. I also need to automate the regression test suite at some point so this article is very relevant. I’d be interested in your opinion, if you have the time.


    January 26, 2017 at 5:02 am
    • Angie Jones

      Susie, this would work for any level of testing 🙂

      January 26, 2017 at 5:05 am
