It is often the case that code changes affect the performance of the code. A bug fix may fix the bug yet slow the code down in the process, and so-called "optimizations" and refactorings may actually degrade performance rather than improve it. Yet performance is typically not tested when each chunk of code is checked in, so nobody notices whether the change had any effect on performance, good or bad!
So, one could extend the JUnit framework to track the performance history of each test.
This could be simple and lightweight, storing performance numbers in local text files, or it could support a database driven framework with lots of fancy historical analysis.
It could track the min/max/weighted-average/weight of the real-time and CPU seconds needed to execute each test, and it could flag any test whose execution time has changed dramatically from the norm. Refactorings that did not affect functionality (but did affect performance) would be caught this way.
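As a minimal sketch of the flagging idea, the class below keeps an exponentially weighted running average of a test's elapsed times and reports when a new run deviates sharply from that norm. The class name, the 0.2 weighting factor, and the 50% deviation threshold are all illustrative assumptions, not part of any existing framework.

```java
// Hypothetical sketch: an exponentially weighted running average of test
// times, used to flag runs that deviate sharply from the norm.
public class PerfHistory
{
    static final double kAlpha = 0.2;     // weight given to the newest run (assumed)
    static final double kThreshold = 0.5; // flag a +/-50% deviation (assumed)

    private double fAverage = -1;         // weighted average so far; -1 = no data yet

    /** Record one run; return true if it deviates sharply from the norm. */
    public boolean record(long elapsedMillis)
    {
        if (fAverage < 0) {               // first run: just seed the average
            fAverage = elapsedMillis;
            return false;
        }
        boolean flagged =
            Math.abs(elapsedMillis - fAverage) > kThreshold * fAverage;
        fAverage = kAlpha * elapsedMillis + (1 - kAlpha) * fAverage;
        return flagged;
    }

    public double average() { return fAverage; }
}
```

A real implementation would persist the average (or the raw history) between JVM runs, which is exactly what the flat-file scheme below provides.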
import java.io.FileWriter;
import java.io.PrintWriter;
import java.text.DateFormat;
import java.util.Date;
import junit.framework.TestCase;

/**
 * A custom JUnit fixture (following JBuilder practice)
 * that times each test run and accumulates the data in
 * a flat comma-separated-value file (to be easily analysed
 * via a spreadsheet).
 * @author Bruce Wallace
 * @see "JBuilder 7 Building Applications pg 9-6"
 */
public class TimerFixture
{
    final static String kPrefix = "TestData_";
    final static String kSuffix = ".csv";
    final static String kSep    = ",";

    private TestCase fTest; // the test being timed
    private long     fTime; // start time in milliseconds

    public TimerFixture(Object obj)
    {
        fTest = (TestCase) obj;
    }

    public void setUp()
    {
        fTime = System.currentTimeMillis();
    }

    public void tearDown() throws Exception
    {
        // get data to log
        long now = System.currentTimeMillis(); // GET THIS FIRST!!
        long elapsed = now - fTime;
        String date = DateFormat.getDateTimeInstance
                          ( DateFormat.SHORT, DateFormat.SHORT )
                          .format( new Date(now) );
        String name = fTest.getName();
        //System.out.println("\nTest: "+name+" elapsed time:"+elapsed );

        // append info to test data collection file
        String fname = kPrefix + name + kSuffix;
        PrintWriter out = new PrintWriter( new FileWriter(fname, true) );
        out.println( date +kSep+ name +kSep+ elapsed );
        out.close(); // flush and release the file handle
    }
}
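The "spreadsheet" side could also be scripted. The sketch below summarises the CSV lines that the fixture appends (format: date,name,elapsed), computing the min, max, and mean of the elapsed column. The class and method names are illustrative assumptions, not part of the fixture itself.

```java
// Hypothetical sketch: summarise the elapsed-time column of the flat CSV
// lines produced by a fixture like TimerFixture (format: date,name,elapsed).
import java.util.List;

public class TestDataSummary
{
    /** Return {min, max, mean} of the elapsed column, in milliseconds. */
    public static long[] summarize(List<String> csvLines)
    {
        long min = Long.MAX_VALUE, max = Long.MIN_VALUE, sum = 0;
        for (String line : csvLines) {
            // elapsed is the last comma-separated field on each line
            long elapsed = Long.parseLong(
                line.substring(line.lastIndexOf(',') + 1).trim());
            min = Math.min(min, elapsed);
            max = Math.max(max, elapsed);
            sum += elapsed;
        }
        return new long[] { min, max, sum / csvLines.size() };
    }
}
```

Reading the last field rather than splitting on every comma keeps the sketch safe even if the locale's short date format happens to contain a comma.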