A Smackdown Tool for Overeager TDDers

by Jeff Langr

June 27, 2012


I’ve always prefaced my first test-driven development (TDD) exercises by saying something like, “Make sure you write no more code than necessary to pass your test. Don’t put in data structures you don’t need, for example.” This pleading typically comes on the tail of a short demo where I’ve mentioned the word incremental numerous times.

But most people don’t listen well, and instead do what they’ve been habituated to do.

With students in shu mode, it’s ok for instructors to be emphatic and dogmatic, smacking students upside the head when they break the rules for an exercise. It’s impossible to properly learn TDD if you don’t follow the sufficiency rule, whether deliberately or not. Trouble is, it’s tough for me to smack the heads of a half-dozen pairs all at once, and some people tend to call in HR when you hit them.

Incrementalism is such an important concept that I’ve introduced a new starting exercise to give me one more opportunity to push the idea. The natural tendency of students to jump straight to an end solution is one of the harder habits to break (and a frequent cause of their negative first reaction when they actually try TDD).

I present a meaty first example (latest: the Soundex algorithm) where all the tests are marked as ignored or disabled, a great idea I learned from James Grenning. In Java, the students un-@Ignore tests one by one, getting each to pass in turn, until the whole suite passes. The few required instructions are in the test file itself, meaning students can be working on the exercise about two minutes after class begins.
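
To make the setup concrete, here’s a rough sketch of what such a test file might look like. The test names and assertions below are illustrative only, not my actual exercise file:

import org.junit.*;
import static org.junit.Assert.*;

// Illustrative sketch only; the real exercise file differs.
public class SoundexTest {
   private Soundex soundex = new Soundex();

   // The first test starts out enabled; students make it pass first.
   @Test
   public void retainsSoleLetterOfOneLetterWord() {
      assertEquals("A000", soundex.encode("A"));
   }

   // Each remaining test stays @Ignore'd until its turn.
   @Ignore
   @Test
   public void replacesConsonantsWithAppropriateDigits() {
      assertEquals("A100", soundex.encode("Ab"));
   }

   // ...and so on, one test at a time.
}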

Problem is, students have a hard time not breaking rules, and tend to implement too much. As I walk around, I catch them, but it’s often a little too late. Telling them they need to scrap their code and back up isn’t what they want to hear.

So, I built a custom test-runner that will instead fail their tests if they code too much, acting as a virtual head-smacking Jeff. (I built a similar tool for C++ that I’ve used successfully in a couple C++ classes.)

Here’s the (hastily built) code:

import org.junit.*;
import org.junit.internal.*;
import org.junit.internal.runners.model.*;
import org.junit.runner.*;
import org.junit.runner.notification.*;
import org.junit.runners.*;
import org.junit.runners.model.*;

// A custom JUnit runner: @Ignore'd tests are run anyway and must fail.
// If one passes, the student has built more than the enabled tests require.
public class IncrementalRunner extends BlockJUnit4ClassRunner {
   public IncrementalRunner(Class<?> klass)
         throws InitializationError {
      super(klass);
   }

   // Mirrors the default runChild(), except that @Ignore'd tests are
   // executed rather than skipped, and are expected to fail.
   @Override
   protected void runChild(
         FrameworkMethod method, RunNotifier notifier) {
      EachTestNotifier eachNotifier = 
         derivedMakeNotifier(method, notifier);
      if (method.getAnnotation(Ignore.class) != null) {
         runIgnoredTest(method, eachNotifier);
         return;
      }

      eachNotifier.fireTestStarted();
      try {
         methodBlock(method).evaluate();
      } catch (AssumptionViolatedException e) {
         eachNotifier.addFailedAssumption(e);
      } catch (Throwable e) {
         eachNotifier.addFailure(e);
      } finally {
         eachNotifier.fireTestFinished();
      }
   }

   private void runIgnoredTest(
         FrameworkMethod method, EachTestNotifier eachNotifier) {
      eachNotifier.fireTestStarted();
      runExpectingFailure(method, eachNotifier);
      eachNotifier.fireTestFinished();
   }

   private EachTestNotifier derivedMakeNotifier(
         FrameworkMethod method, RunNotifier notifier) {
      Description description = describeChild(method);
      return new EachTestNotifier(notifier, description);
   }

   private void runExpectingFailure(
         final FrameworkMethod method, EachTestNotifier notifier) {
      // An @Ignore'd test that already passes means too much was implemented.
      if (runsSuccessfully(method)) 
         notifier.addFailure(
            new RuntimeException("You've built too much, causing " + 
                                 "this ignored test to pass."));
   }

   private boolean runsSuccessfully(final FrameworkMethod method) {
      try {
         methodBlock(method).evaluate();
         return true;
      } catch (Throwable e) {
         return false;
      }
   }
}

(Note: this code is written for JUnit 4.5 due to client version constraints.)

All the custom runner does is run the tests still marked @Ignore and expect them to fail. (I think I was forced into completely overriding runChild to add my behavior in runIgnoredTest, but I could be wrong. Please let me know if you’re aware of a simpler way.) To use the runner, you simply annotate your test class with @RunWith(IncrementalRunner.class).
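
For instance, wiring the runner into the hypothetical test class sketched above takes a single annotation:

import org.junit.*;
import org.junit.runner.RunWith;

// Same illustrative test class as before, now run by IncrementalRunner.
@RunWith(IncrementalRunner.class)
public class SoundexTest {
   // ...same enabled and @Ignore'd tests as before...
}

With that in place, any still-@Ignore’d test that happens to pass shows up as a plain old test failure right in front of the student.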

To effectively use the tool, you must provide students with a complete set of tests that supply a definitive means of incrementally building a solution. For any given test, there must be a possible implementation that doesn’t cause any later test to pass. It took me a couple tries to create a good sequence for the Soundex solution.
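
To make that concrete with the hypothetical Soundex tests sketched earlier: a student could pass the first test with something like the following, which deliberately does not satisfy the still-@Ignore’d consonant test.

// Hypothetical "just enough" implementation for the first test only.
public class Soundex {
   public String encode(String word) {
      // Enough for "A" -> "A000"; the @Ignore'd tests still fail.
      return word + "000";
   }
}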

The tool is neither foolproof nor clever-proof; a small bit of monkeying about and a willingness to deliberately cheat will get around it quite easily. (There are probably a half-dozen ways to defeat the mechanism: for example, students could un-ignore tests prematurely, or they could simply turn off the custom test runner.) But as long as they’re not devious, the test failure from building too much gets in their face and smacks them when I’m not around.

If you choose to try this technique, please drop me a line and let me know how it went!

