Tuesday, December 24, 2002

Add Performance Tests to the Unit Test Paradigm

Among the kinds of automated unit tests typically implemented in frameworks like JUnit, I think performance tests would be a worthy addition.

Code changes often affect performance.  A bug fix may well fix the bug but slow things down in the process, and so-called "optimizations" and refactorings may actually hurt performance rather than improve it.  Yet performance is not typically tested, so when each chunk of code is checked in, nobody notices whether it had any effect on performance, good or bad!

So, one could extend the JUnit framework to track the performance history of each test.
This could be simple and lightweight, storing performance numbers in local text files, or it could grow into a database-driven framework with lots of fancy historical analysis.

It could track the min, max, and weighted average (and its weight) of the real-time and CPU seconds taken to execute each test, and it could flag any test whose execution time has changed dramatically from the norm. Refactorings that did not affect functionality (but did affect performance) could be caught this way.

Something like...

import java.io.*;
import java.util.*;
import java.text.*;
import junit.framework.*;

/**
 * A custom JUnit fixture (following JBuilder practice)
 * that times each test run and accumulates the data in
 * a flat comma-separated-value file (to be easily analysed
 * via a spreadsheet).
 * @author Bruce Wallace
 * @see "JBuilder 7 Building Applications pg 9-6"
 */
public class TimerFixture
{
  final static String kPrefix = "TestData_";
  final static String kSuffix = ".csv";
  final static String kSep    = ",";

  long     fTime;
  TestCase fTest;

  public TimerFixture(Object obj)
  {
    fTest = (TestCase) obj;
  }

  public void setUp()
  {
    fTime = System.currentTimeMillis();
  }

  public void tearDown() throws Exception
  {
    // get data to log
    long now     = System.currentTimeMillis(); // GET THIS FIRST!!
    long elapsed = now - fTime;

    String date  = DateFormat.getDateTimeInstance
                 ( DateFormat.SHORT, DateFormat.SHORT )
                  .format( new Date(now) );

    String name  = fTest.getName();
    //System.out.println("\nTest: "+name+" elapsed time:"+elapsed );

    // append info to test data collection file
    String fname    = kPrefix + name + kSuffix;
    PrintWriter out = new PrintWriter( new FileWriter(fname,true) );
    out.println( date +kSep+ name +kSep+ elapsed );
    out.close();
  }
}
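
For the "flag any test whose performance has changed dramatically" idea, here is a rough sketch of a report program that scans the TestData_*.csv files written by the fixture above, averages the historical elapsed times, and warns when the latest run strays too far from that average. The class name, the 50% threshold, and the scan-the-current-directory approach are just illustrative choices, not part of any existing framework.

import java.io.*;
import java.util.*;

/**
 * Sketch of a report over the CSV files written by TimerFixture.
 * For each TestData_*.csv file in the current directory, it computes
 * the mean elapsed time and warns if the most recent run differs
 * from that mean by more than a threshold.
 */
public class TimerReport
{
  // warn when the latest run differs from the historical mean by more than 50%
  final static double kThreshold = 0.50;

  public static void main(String[] args) throws IOException
  {
    File[] files = new File(".").listFiles( new FilenameFilter() {
      public boolean accept(File dir, String name) {
        return name.startsWith(TimerFixture.kPrefix)
            && name.endsWith(TimerFixture.kSuffix);
      }
    } );

    for (int i = 0; files != null && i < files.length; i++)
    {
      BufferedReader in = new BufferedReader( new FileReader(files[i]) );
      long total = 0, count = 0, latest = 0;
      String line;
      while ((line = in.readLine()) != null)
      {
        // each line is: date,testName,elapsedMillis (as written by tearDown)
        StringTokenizer st = new StringTokenizer(line, TimerFixture.kSep);
        st.nextToken();  // date
        st.nextToken();  // test name (also encoded in the file name)
        latest = Long.parseLong( st.nextToken().trim() );
        total += latest;
        count++;
      }
      in.close();

      if (count < 2) continue;  // not enough history to judge yet

      double mean = (double) total / count;
      if (Math.abs(latest - mean) > mean * kThreshold)
      {
        System.out.println( files[i].getName() + ": latest run " + latest
          + "ms vs. historical mean of " + (long) mean + "ms" );
      }
    }
  }
}

A smarter version could compute the weighted average and CPU time mentioned above, or feed the data into a database for fancier historical analysis.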

Wednesday, December 18, 2002

Add main() documentation to JavaDoc

Open Letter to JavaDoc group at Sun...

There is a major aspect of Java programming that JavaDoc does not cover: documenting standalone programs, i.e. main() methods.

The reason that simply adding JavaDoc comments to main() is not sufficient is that, while the program may be implemented inside some internal package, running the program is, from the outside world's perspective, a public thing.

There needs to be an "Applications" section up top on the overview page, alongside the packages summary.  In this section, all main() methods that are being "published" could document their calling sequence (a.k.a. command-line parameters), the formats of any input and output files, the list and meaning of the process exit codes, and generally anything a program needs to document so that the outside world knows how to use it.

In the Unix world, these things went in "man" pages.  Java needs a standard, platform-invariant way to publish the same info via the JavaDoc mechanism.
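
To make the request concrete, here is a purely hypothetical sketch of what a "published" main() comment might look like; the application, its options, and the idea of dedicated tags are all invented for illustration and are not existing JavaDoc features.

/**
 * Command-line entry point for the (hypothetical) LogScanner application.
 *
 * Usage:   java com.example.LogScanner [-v] input-file output-file
 *
 * Input:   a plain-text server log, one entry per line.
 * Output:  a comma-separated summary suitable for a spreadsheet.
 *
 * Exit codes:
 *   0  success
 *   1  bad command-line arguments
 *   2  input file could not be read
 */
public static void main(String[] args) { /* ... */ }

With standard tags for such items (say, @usage and @exitcode) and an "Applications" section on the overview page, this would amount to a "man" page generated by the JavaDoc mechanism itself.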

I would be interested in following up with you to develop specific proposals, but at this point I simply wanted to register the need with you and see if any efforts were already in the works.