
From Manning’s JUnit Recipes: Practical Methods for Programmer Testing (pages 239–255).

7.6 Extending Ant’s JUnit results format

Problem

You are running JUnit with Ant and you need to customize the results format to add more information or adhere to a specialized format.

3 In fact, why don’t the Ant folks ditch the <report> task and just use the <style> task, at least under the covers?


Background

We have seen a situation in which a legacy test results management system, originally developed without support for Ant or JUnit, needed to be outfitted with support for test results produced from JUnit, which was being run by Ant. One of the requirements was to make Ant’s XML output of JUnit results conform to the input file format of the repository. The XML files could then be analyzed and reported on by the results management system without knowing their origin. The results repository took an XML input file that looked similar to the XML-formatted results that Ant’s <junit> task can output when using a nested <formatter type="xml"> element. A good solution to the problem was to extend and customize the XML results format of Ant’s XML formatter.

Another situation in which you might want to customize Ant’s JUnit results format would be if you wanted your results to be in PostScript, PDF, or HTML format. You can output the desired format directly without producing intermediate XML results files that need to be processed by XSL.

Recipe

1 Implement the interface JUnitResultFormatter, found in the package org.apache.tools.ant.taskdefs.optional.junit.

2 Specify the name of the custom formatter class in the classname attribute of a nested <formatter> element of the <junit> task in your build script.

Listing 7.8 shows one way to implement these steps in a custom results formatter that outputs reports in HTML format. Note that this class depends on Ant tools, so you need ant.jar and ant-junit.jar, which are both part of the Ant distribution, to compile it.

Listing 7.8 HtmlJUnitResultFormatter

package junit.cookbook.reporting.ant;

import java.io.*;
import java.text.NumberFormat;
import java.util.Hashtable;

import junit.framework.*;

import org.apache.tools.ant.BuildException;
import org.apache.tools.ant.taskdefs.optional.junit.*;

public class HtmlJUnitResultFormatter implements JUnitResultFormatter {

    /** Formatter for timings. */
    private NumberFormat nf = NumberFormat.getInstance();

    /** Timing helper. */
    private Hashtable testStarts = new Hashtable();

    /** Where to write the log to. */
    private OutputStream out;

    /** Helper to store intermediate output. */
    private StringWriter middle;

    /** Convenience layer on top of {@link #middle middle}. */
    private PrintWriter wri;

    /** Suppress endTest if testcase failed. */
    private Hashtable failed = new Hashtable();

    private String systemOutput = null;
    private String systemError = null;

    public HtmlJUnitResultFormatter() {
        middle = new StringWriter();
        wri = new PrintWriter(middle);
    }

    public void setOutput(OutputStream out) {
        this.out = out;
    }

    public void setSystemOutput(String out) {
        systemOutput = out;
    }

    public void setSystemError(String err) {
        systemError = err;
    }

    /**
     * The whole testsuite ended.
     */
    public void endTestSuite(JUnitTest suite) throws BuildException {
        String nl = System.getProperty("line.separator");
        StringBuffer header = new StringBuffer(
            "<html>" + nl
            + "<head><title>JUnit Results</title></head>" + nl
            + "<body>" + nl
            + "<table border=\"1\">" + nl);
        header.append(
            "<tr><th>Suite: " + suite.getName()
            + "</th><th>Time</th></tr>" + nl);

        StringBuffer footer = new StringBuffer();
        footer.append(nl + "<tr><td>");
        footer.append("Tests run:");
        footer.append("</td><td>");
        footer.append(suite.runCount());
        footer.append("</td></tr>" + nl + "<tr><td>");
        footer.append("Failures:");
        footer.append("</td><td>");
        footer.append(suite.failureCount());
        footer.append("</td></tr>" + nl + "<tr><td>");
        footer.append("Errors:");
        footer.append("</td><td>");
        footer.append(suite.errorCount());
        footer.append("</td></tr>" + nl + "<tr><td>");
        footer.append("Time elapsed:");
        footer.append("</td><td>");
        footer.append(nf.format(suite.getRunTime() / 1000.0));
        footer.append(" sec");
        footer.append("</td></tr>");
        footer.append(nl);

        // append both the output and error streams to the log
        if (systemOutput != null && systemOutput.length() > 0) {
            footer
                .append("<tr><td>Standard Output</td><td>")
                .append("<pre>")
                .append(systemOutput)
                .append("</pre></td></tr>");
        }
        if (systemError != null && systemError.length() > 0) {
            footer
                .append("<tr><td>Standard Error</td><td>")
                .append("<pre>")
                .append(systemError)
                .append("</pre></td></tr>");
        }
        footer.append("</table>" + nl + "</body>" + nl + "</html>");

        if (out != null) {
            try {
                out.write(header.toString().getBytes());
                out.write(middle.toString().getBytes());
                out.write(footer.toString().getBytes());
                wri.close();
                out.flush();
            } catch (IOException ioe) {
                throw new BuildException("Unable to write output", ioe);
            } finally {
                if (out != System.out && out != System.err) {
                    try {
                        out.close();
                    } catch (IOException e) {
                    }
                }
            }
        }
    }

    /**
     * From interface TestListener.
     * <p>A new Test is started.
     */
    public void startTest(Test test) {
        testStarts.put(test, new Long(System.currentTimeMillis()));
        failed.put(test, Boolean.FALSE);
        wri.print("<tr><td>");
        wri.print(JUnitVersionHelper.getTestCaseName(test));
        wri.print("</td>");
    }

    /**
     * From interface TestListener.
     * <p>A Test is finished.
     */
    public void endTest(Test test) {
        synchronized (wri) {
            if (Boolean.TRUE.equals(failed.get(test))) {
                return;
            }
            Long secondsAsLong = (Long) testStarts.get(test);
            double seconds = 0;
            // can be null if an error occurred in setUp
            if (secondsAsLong != null) {
                seconds =
                    (System.currentTimeMillis() - secondsAsLong.longValue())
                        / 1000.0;
            }
            wri.print("<td>");
            wri.print(nf.format(seconds));
            wri.print(" sec</td></tr>");
        }
    }

    /**
     * Interface TestListener for JUnit > 3.4.
     *
     * <p>A Test failed.
     */
    public void addFailure(Test test, AssertionFailedError t) {
        formatThrowable("failure", test, (Throwable) t);
    }

    /**
     * Interface TestListener.
     *
     * <p>An error occurred while running the test.
     */
    public void addError(Test test, Throwable t) {
        formatThrowable("error", test, t);
    }

    private void formatThrowable(String type, Test test, Throwable t) {
        synchronized (wri) {
            if (test != null) {
                failed.put(test, Boolean.TRUE);
                endTest(test);
            }
            wri.println("<td><pre>");
            wri.println(t.getMessage());
            // filter the stack trace to squelch Ant and JUnit stack
            // frames in the report
            String strace = JUnitTestRunner.getFilteredTrace(t);
            wri.print(strace);
            wri.println("</pre></td></tr>");
        }
    }

    /**
     * From interface JUnitResultFormatter. We do nothing with this
     * method, but we have to implement all the interface's methods.
     */
    public void startTestSuite(JUnitTest suite) throws BuildException {
    }
}

Although this looks like an awful lot of code, the general idea is straightforward.

At each stage of executing the test suite, Ant generates various events: one when the test suite starts executing, one when it ends, one for each test, and one for each test failure or error. For each of these events, we have provided an event handler that outputs HTML corresponding to each event.

For the “start test suite” event, there is nothing to do. If we wanted to add some kind of test suite header, we would have added that here. For the “start test” event, we start an HTML table row and write out the name of the test. What we write out next depends on how the test ends. If the test fails, we treat the assertion failure as a Throwable object (it is an AssertionFailedError, after all) and print the stack trace as preformatted text in a <pre> tag. This is the same behavior we use when the test ends with an error, due to throwing an unexpected exception. Finally, for the “end test suite” event, we write out a summary of the test run, with failure and error counts as well as any text written to the standard output and error streams.

This is a pretty comprehensive report!
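The timing cells in the report come from simple millisecond arithmetic: subtract the recorded start time, divide by 1000.0, and hand the result to a NumberFormat. The calculation can be sketched in isolation (ElapsedFormat is our own illustrative class, not part of the recipe; it pins Locale.US for a predictable decimal separator, whereas the listing uses the default locale):

```java
import java.text.NumberFormat;
import java.util.Locale;

public class ElapsedFormat {
    // Mirror of the listing's elapsed-time arithmetic: subtract the
    // recorded start time, divide by 1000.0, and format the result.
    static String secondsFor(long startMillis, long endMillis) {
        NumberFormat nf = NumberFormat.getInstance(Locale.US);
        return nf.format((endMillis - startMillis) / 1000.0) + " sec";
    }

    public static void main(String[] args) {
        System.out.println(secondsFor(0L, 1234L)); // 1.234 sec
    }
}
```

NumberFormat defaults to at most three fraction digits, which is why the listing's reports show timings like "1.234 sec" rather than a raw double.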


Here is an Ant target that can be used in an Ant build file for running tests and reporting results using our custom formatter:

<target name="ant-custom-formatter"
        description="-> demos custom Ant results formatter">
  <mkdir dir="${custom.reports.dir}"/>
  <junit printsummary="yes" haltonfailure="no">
    <classpath>
      <pathelement location="${classes.dir}"/>
      <pathelement path="${java.class.path}"/>
    </classpath>
    <batchtest fork="yes" todir="${custom.reports.dir}">
      <formatter
          classname="junit.cookbook.reporting.ant.HtmlJUnitResultFormatter"
          extension=".html"
          usefile="true"/>
      <fileset dir="${src.dir}">
        <include name="**/tests/runner/AllTests.java"/>
      </fileset>
    </batchtest>
  </junit>
</target>

Discussion

One limitation of our HtmlJUnitResultFormatter example is that it outputs one HTML file per test case class that executes. So while it is fine for reporting results for a few medium to large test suites, which will produce a few short- to medium-length HTML report files, it becomes unusable when dealing with dozens or hundreds of test case classes.

This recipe could be enhanced to produce HTML frames documents to organize and link the individual HTML reports together. You should also be able to see how to write your own custom XML output formatter from this example: just use XML tags instead of HTML. Also, see Ant’s own XMLJUnitResultFormatter for inspiration.
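One wrinkle when switching from HTML to XML output: test names and failure messages can contain markup characters that must be escaped before being embedded in element content or attributes. A minimal helper for that, sketched here as our own utility (XmlEscape is hypothetical; Ant itself ships a DOMElementWriter class that handles this, and code that builds output through the DOM API gets escaping for free):

```java
// Minimal XML text-escaping helper: replaces the characters that
// are significant in XML element content and attribute values.
class XmlEscape {
    static String escape(String s) {
        StringBuilder sb = new StringBuilder(s.length());
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            switch (c) {
                case '<': sb.append("&lt;"); break;
                case '>': sb.append("&gt;"); break;
                case '&': sb.append("&amp;"); break;
                case '"': sb.append("&quot;"); break;
                default: sb.append(c);
            }
        }
        return sb.toString();
    }
}
```

A formatter would call this on every message before printing it, for example wri.print(XmlEscape.escape(t.getMessage())).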

Generally, implementing custom report formats comes down to a choice between writing Java extensions of Ant’s JUnit APIs, as we do in this recipe, or writing new or customized XSL stylesheets for the <junitreport> task. The approach you choose often depends on which technology better suits your reporting requirements and the skill set of your team.

Related

■ 7.3—Getting plain text results with Ant

■ 7.5—Customizing <junit> XML reports with XSLT


7.7 Implementing TestListener and extending TestRunner

Problem

You want total control of the format of JUnit’s test results reporting.

Background

A common question on the JUnit Yahoo! group is how to customize JUnit’s test results reporting. The default reporting of the text-based test runner is pretty bare bones (“.” for pass, “E” for error, “F” for failure). The Swing-based and AWT-based runners display similar results in an interactive GUI.

For getting HTML or XML results files out of your JUnit test runs, the most common practice is to use Ant’s <junit> and <junitreport> tasks to execute the tests and report the results. But we have come across cases where Ant is not or cannot be used, or where Ant might be more of a hassle to work with than simply extending the JUnit framework. (If your only problem with Ant is that it does not support your target XML or HTML reporting format, first see recipe 7.6, and see if that’s enough to solve your problem.) In cases such as these, you can extend JUnit to format and output results any way you want by using APIs in the JUnit framework.

Recipe

Implement junit.framework.TestListener to define the results format and output mechanism, and then extend junit.textui.TestRunner to “register” the listener with the test runner. We’ll go through the process in steps.

NOTE Observer/Observable: In this context a Listener, as in TestListener, is one of the participants in an implementation of the Observer pattern as captured in the so-called “Gang of Four” (Erich Gamma, Richard Helm, Ralph Johnson, John Vlissides) book Design Patterns.

In short, a listener, also known as an observer or subscriber, is an object that attaches or registers itself with another object (the “observable”) in order to receive updates or notifications when the observable changes state.

The observable is also known as the “subject” or “publisher.” Each publisher can have many subscribers. In our recipe here, the CookbookTestRunner is the publisher for the CookbookTestListener. But TestRunners can also be listeners (and implement the TestListener interface), registering themselves with test results in order to handle the display or routing of test events themselves. To see some source code examples of runners that are listeners, see junit.awtui.TestRunner.runSuite() and junit.swingui.TestRunner.doRunTest(Test testSuite).
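The registration-and-notification flow the NOTE describes can be sketched with a toy publisher and subscriber. All names here (MiniRunner, ResultListener) are hypothetical and exist only to illustrate the pattern, not JUnit’s actual API:

```java
import java.util.ArrayList;
import java.util.List;

// The subscriber's contract: callbacks the publisher will invoke.
interface ResultListener {
    void testStarted(String name);
    void testFailed(String name, String message);
}

// The publisher: holds registered listeners and notifies each of
// them when an event occurs, just as a test runner notifies its
// TestListeners.
class MiniRunner {
    private final List<ResultListener> listeners = new ArrayList<>();

    void addListener(ResultListener l) {
        listeners.add(l);
    }

    void fireStarted(String name) {
        for (ResultListener l : listeners) l.testStarted(name);
    }

    void fireFailed(String name, String message) {
        for (ResultListener l : listeners) l.testFailed(name, message);
    }
}
```

A listener subscribes once via addListener() and then passively receives every event the runner publishes, which is exactly the relationship between CookbookTestRunner and CookbookTestListener in this recipe.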


First, implement the TestListener interface to define the output format for results. Listing 7.9 shows the interface to implement in order to control results output for a test runner. Note that all the methods accept an object implementing junit.framework.Test (either a TestCase or a TestSuite) as a parameter.

Listing 7.9 junit.framework.TestListener

package junit.framework;

/**
 * A Listener for test progress
 */
public interface TestListener {
    /**
     * An error occurred. The parameter t is the unexpected
     * Throwable causing the error.
     */
    public void addError(Test test, Throwable t);

    /**
     * A failure occurred. The parameter t is the
     * AssertionFailedError representing the failure.
     */
    public void addFailure(Test test, AssertionFailedError t);

    /**
     * Notification of a test starting.
     */
    public void startTest(Test test);

    /**
     * Notification of a test ending.
     */
    public void endTest(Test test);
}

Each test has the potential to produce one or more failures or errors as JUnit executes it. The TestListener is notified by a failure or error event if a test fails or has an error. The TestListener is not notified of successes, so we can assume that any test that starts and ends without an error or failure succeeded. Note that the listener also receives notification of start and end events. These events give us a good place to decorate and format each test and its results as it executes and notifies the test listener. Something that some people find strangely absent are startSuite() and endSuite() events and methods, since the API seems incomplete without them. But we can add these methods to our TestListeners, and could even extend the TestListener interface with our own interface that required extra methods. In fact, this is what the Ant JUnit reporting tasks do, as we see elsewhere in this chapter.
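The success-by-elimination rule just described can be sketched as a small bookkeeping class: a test passed if it started and never produced a failure or error. PassFailTally is a hypothetical illustration, not part of the JUnit API, and it tracks tests by name rather than by Test object for simplicity:

```java
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.Set;

// Infers successes the way a TestListener must: record every test
// that starts, record every failure or error, and count as passed
// whatever started without a recorded problem.
class PassFailTally {
    private final Set<String> started = new LinkedHashSet<>();
    private final Set<String> problems = new HashSet<>();

    void startTest(String name) { started.add(name); }
    void addFailure(String name) { problems.add(name); }
    void addError(String name) { problems.add(name); }

    int passedCount() {
        int passed = 0;
        for (String name : started) {
            if (!problems.contains(name)) passed++;
        }
        return passed;
    }
}
```

This is the same inference CookbookTestListener relies on below: it emits no explicit "success" element, because a 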

For demonstration purposes we just need a simple test listener implementation that does something interesting. So we will write a listener that is capable of writing out test results in a simple XML format. The test listener is responsible for formatting the results of each test. The results are stored in XML as we build the document in memory using the org.w3c.dom API. We also provide a print() method that serializes the XML to an output stream, and a getXmlAsString() method that returns the XML as a String. Listing 7.10 shows CookbookTestListener, our TestListener implementation.

Listing 7.10 CookbookTestListener

package junit.cookbook.reporting;

import java.io.PrintStream;
import java.io.StringWriter;
import java.text.NumberFormat;

import javax.xml.parsers.*;
import javax.xml.transform.*;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;

import junit.framework.*;
import junit.runner.BaseTestRunner;

import org.w3c.dom.*;

public class CookbookTestListener implements TestListener {

    PrintStream printStream;
    Document xmlOutput;
    Element xmlRoot;
    Element testCase;
    int errorCount = 0;
    int failureCount = 0;
    int testCaseCount = 0;

    /**
     * Default constructor creates a CookbookTestListener that streams
     * results to System.out.
     *
     * @throws ParserConfigurationException
     */
    public CookbookTestListener() throws ParserConfigurationException {
        this(System.out);
    }

    /**
     * Creates a new CookbookTestListener that captures results in an XML
     * Document and serializes the XML to the specified
     * <code>printStream</code>
     *
     * @param printStream to use for serializing XML results
     * @throws ParserConfigurationException
     */
    public CookbookTestListener(PrintStream printStream)
        throws ParserConfigurationException {
        DocumentBuilderFactory factory =
            DocumentBuilderFactory.newInstance();
        DocumentBuilder builder = factory.newDocumentBuilder();
        xmlOutput = builder.newDocument();
        this.printStream = printStream;
    }

    public void startSuite(Test suite) {
        xmlRoot = (Element) xmlOutput.createElement("testsuite");
        xmlRoot.setAttribute("class", suite.toString());
        xmlOutput.appendChild(xmlRoot);
    }

    public void addError(Test test, Throwable t) {
        errorCount++;
        Element error = (Element) xmlOutput.createElement("error");
        addThrowable(t, error);
    }

    public void addFailure(Test test, AssertionFailedError t) {
        failureCount++;
        Element failure = (Element) xmlOutput.createElement("failure");
        addThrowable(t, failure);
    }

    public void startTest(Test test) {
        testCase = (Element) xmlOutput.createElement("test");
        String methodStr = ((TestCase) test).getName();
        testCase.setAttribute("name", methodStr);
        xmlRoot.appendChild(testCase);
    }

    public void endTest(Test test) {
        testCaseCount = testCaseCount + test.countTestCases();
    }

    public void print() throws TransformerException {
        Transformer transformer = getTransformer();
        DOMSource source = new DOMSource(xmlOutput);
        StreamResult streamResult = new StreamResult(printStream);
        transformer.transform(source, streamResult);
    }

    /**
     * @return output of the test results as a String
     */
    public String getXmlAsString() throws TransformerException {
        Transformer transformer = getTransformer();
        DOMSource source = new DOMSource(xmlOutput);
        StringWriter xmlString = new StringWriter();
        StreamResult streamResult = new StreamResult(xmlString);
        transformer.transform(source, streamResult);
        return xmlString.toString();
    }

    public void endSuite(TestResult testResult, long runTime) {
        Element summary = (Element) xmlOutput.createElement("summary");
        Element tests = (Element) xmlOutput.createElement("tests");
        Element errors = (Element) xmlOutput.createElement("errors");
        Element failures = (Element) xmlOutput.createElement("failures");
        Element runtime = (Element) xmlOutput.createElement("runtime");
        String testCount = String.valueOf(testResult.runCount());
        String errCount = String.valueOf(testResult.errorCount());
        String failCount = String.valueOf(testResult.failureCount());
        String runTimeStr =
            NumberFormat.getInstance().format((double) runTime / 1000);
        tests.appendChild(xmlOutput.createTextNode(testCount));
        errors.appendChild(xmlOutput.createTextNode(errCount));
        failures.appendChild(xmlOutput.createTextNode(failCount));
        runtime.appendChild(xmlOutput.createTextNode(runTimeStr));
        xmlRoot.appendChild(summary);
        summary.appendChild(tests);
        summary.appendChild(errors);
        summary.appendChild(failures);
        summary.appendChild(runtime);
    }

    private void addThrowable(Throwable t, Element elem) {
        String trace = BaseTestRunner.getFilteredTrace(t);
        elem.setAttribute("message", t.getMessage());
        elem.appendChild(xmlOutput.createCDATASection(trace));
        testCase.appendChild(elem);
    }

    private Transformer getTransformer() throws TransformerException {
        TransformerFactory tFactory = TransformerFactory.newInstance();
        Transformer transformer = tFactory.newTransformer();
        transformer.setOutputProperty(
            javax.xml.transform.OutputKeys.INDENT, "yes");
        transformer.setOutputProperty(
            javax.xml.transform.OutputKeys.STANDALONE, "yes");
        return transformer;
    }
}

The methods of the listing deserve comment:

■ startSuite() reports that a new test suite is executing. This creates an XML element that looks like <testsuite class="com.mycom.test.MyTestSuite">.

■ addError() reports an error, complete with its message and a stack trace of the corresponding unexpected exception.

■ addFailure() reports a failure, complete with its message and a stack trace of the corresponding AssertionFailedError.

■ startTest() reports that an individual test is starting to execute. This creates an XML element that looks like <test name="testMyTestName">.

■ endTest() notes that an individual test has completed, incrementing the running total of executed tests. For a TestCase object, countTestCases() always returns 1.

■ print() writes the XML document we are creating to the TestListener’s PrintStream using the identity transform.4

■ getXmlAsString() is useful during testing, or whenever you might want to see the XML document we are creating as a String.

■ endSuite() reports the end of a test suite, including a summary of the test results.
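The print() and getXmlAsString() methods rely on the identity transform to serialize the in-memory Document. The mechanism can be demonstrated on its own with only the JDK’s JAXP classes (IdentityTransformDemo is our own illustrative class, and the element names in it are arbitrary):

```java
import java.io.StringWriter;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;

import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class IdentityTransformDemo {
    // Build a tiny DOM document and serialize it with a Transformer
    // created without a stylesheet -- the identity transform, which
    // simply copies its input tree to the output.
    static String serialize() throws Exception {
        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.newDocument();
        Element root = doc.createElement("testsuite");
        root.setAttribute("class", "SampleSuite");
        doc.appendChild(root);

        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        StringWriter out = new StringWriter();
        t.transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(serialize());
    }
}
```

TransformerFactory.newTransformer() with no stylesheet argument is the standard JAXP idiom for the identity transform; the listener’s getTransformer() does the same thing, adding only the INDENT and STANDALONE output properties.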

Now we have implemented the listener methods that will receive callbacks from the test runner as tests are executed and test methods start, end, or have an error or failure. In each callback method we used DOM APIs to format the test class, test method names, failures, errors, and results as XML. The second thing we must do is to extend TestRunner so we can tell it to use our TestListener implementation for reporting results. To do that, we have to implement three methods: main() for executing the runner on the command line, processArgs() for handling command-line arguments, and doRun() to register our listener with the test runner. Then we can kick off the test run and call the listener’s print() method. Listing 7.11 shows our custom test runner.

Listing 7.11 CookbookTestRunner, an extension of TestRunner

package junit.cookbook.reporting;

import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.PrintStream;

import javax.xml.parsers.ParserConfigurationException;
import javax.xml.transform.TransformerException;

4 The identity transform is an XSL transformation that applies an identity template to each XML element. The result is output identical to the input: a copy of the input XML document.

