Tuesday, 8 June 2010

What's So Funny About ATDD, BDD, and Dependency Injection?

With apologies to Elvis Costello...

So, I've been thinking about a way to introduce the concepts of Acceptance Test Driven Development (ATDD) and Behavior Driven Development (BDD) to our developers. I'd also like to cover Dependency Injection as this seems to be something that comes up a lot when we discuss TDD in our little neck of the woods.

As an example, I've used an abstraction from a large project that we've been working on. The project is a data conversion project. I won't go into details, but the gist is this: take some data (probably from a database), perform some validation checks on it, filter stuff that we don't want, convert some stuff that we do want, and send it to some kind of output stream.

Our project is Java-based, but I've chosen to use Scala for this example for a number of reasons:
1. Scala seemed interesting, and I wanted to learn more.
2. Scala has parser combinators, which are perfect for testing my idea of using a DSL to define our conversion rules.
3. ScalaTest supports ATDD and BDD beautifully via its FeatureSpec and FlatSpec traits, respectively.
4. Scala has some neat ways of dealing with dependency injection (see Chapter 27 of the Odersky book).
5. Scala is fully interoperable with Java, so we can reuse any libraries or code from the existing project.

See Scala and ScalaTest here and here, respectively.

So, let's get started... (please bear in mind that I am well aware that my Scala code, being a relative newcomer to it, is likely to be a bit rough... I welcome suggestions!)

Firstly, I set up an ant build.xml that would compile my Scala code and run my ScalaTest tests. There are basically 3 files (not counting the build.xml): 1) test/UnitTests.scala, 2) test/AcceptanceTests.scala, and 3) src/Conversion.scala. It's a pretty small project for purposes of this exercise, so there's no need to get crazy here; plus, it's nice to be able to easily ":load" files in the Scala interpreter. (Note: by the end it was just approaching large enough to warrant breaking some stuff out into more files, but I chose not to go there.)

Note: the code for this project can be downloaded here. Please take a look. I am just going to focus this blog on a few of the more interesting points.

The application design was set up with the idea of ATDD in mind. There would be a Conversion class that would take all the necessary bits as parameters: those bits being the datasource, the rules for the conversions, and some kind of output stream device. So, let's look at some of the acceptance tests first.

The first thing to bear in mind was that I was going to need some kind of Data type for retrieving and carrying the input data. I knew that this might come from a database, but I wanted to be able to "inject" it for purposes of these tests. I decided to go with a simple Map[String,String] to capture field names and values. There would be an abstract class called "Data" and one called "DataSet" that I could implement to get my tests set up. The DataSet class would have a foreach method, so that rows could be read from a database one at a time for processing (rather than pass a whole set of data at once, as our datasets are quite large in reality).
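Since I'm only excerpting code in this post, here's a minimal sketch of what that Data abstraction might look like. The method names are inferred from the tests shown later; the downloadable project code is the authoritative version:

```scala
// Sketch of the Data abstraction: an immutable record of field names to
// values, plus a "filtered" flag. Inferred from the tests in this post;
// the linked project code may differ in the details.
abstract class Data {
  def apply(key: String): String            // field lookup, e.g. d("id")
  def has(key: String): Boolean
  def set(key: String, value: String): Data
  def isFiltered: Boolean
  def setFiltered(): Data
}

case class MapData(fields: Map[String, String], filtered: Boolean = false) extends Data {
  def apply(key: String): String = fields(key) // throws NoSuchElementException if missing
  def has(key: String): Boolean = fields.contains(key)
  def set(key: String, value: String): Data = MapData(fields + (key -> value), filtered)
  def isFiltered: Boolean = filtered
  def setFiltered(): Data = MapData(fields, true)
}

// DataSet only needs foreach, so rows can be streamed one at a time.
abstract class DataSet {
  def foreach(f: Data => Unit): Unit
}
```

Making MapData immutable (set and setFiltered return a new Data) keeps the rule functions side-effect free, which pays off later when we chain them.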

Additionally, I would need "Rules". I figured that these would be read from a file and would comprise our "DSL" bit, so I opted to have these be simple Strings. Again, an abstract class called RuleSet (backed by a List[String] for testing) would leave room for other functionality, such as reading the Rules from a file.
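For the file-reading variant, something like this hypothetical helper would do. This is not in the project code; it's shown only to illustrate the idea:

```scala
import scala.io.Source

// Hypothetical file-backed rule loading, sketched for illustration; the
// project's RuleSet hierarchy may handle this differently. Each non-empty
// line of the file becomes one rule String for the parser to chew on.
def rulesFromFile(path: String): List[String] = {
  val src = Source.fromFile(path)
  try src.getLines().map(_.trim).filter(_.nonEmpty).toList
  finally src.close()
}
```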

I also used a static "Log" object to capture system output for monitoring. The default Log would just write to STDOUT, but I also implemented a version that would write to a list buffer, so that I could capture that stuff for testing, too.
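Roughly, the Log object works along these lines (again a sketch, with details possibly differing from the linked code; the "list" type name matches the Log.setType("list") call in the acceptance test below):

```scala
import scala.collection.mutable.ListBuffer

// Sketch of the static Log object: writes to STDOUT by default, or appends
// to an in-memory buffer when the type is set to "list" (handy for tests).
object Log {
  private var logType = "stdout"
  val buffer = new ListBuffer[String]()
  def setType(t: String): Unit = { logType = t }
  def apply(msg: String): Unit = logType match {
    case "list" => buffer += msg
    case _      => println(msg)
  }
}

// Mixed in by classes that want a log() method, e.g. "extends Logging".
trait Logging {
  def log(msg: String): Unit = Log(msg)
}
```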

There's also an Output class, which I implemented with a ListBuffer for testing. In reality, this Output class could write to a database, send data across the network, etc.

Lastly, there's a Conversion object which takes each of our little dependencies via its apply method and does its magic (basically applying all Rules to each Data object in the DataSet and capturing the output via our Output object).

Here's a snapshot of the acceptance tests...

 class RuleApiFeatureSpec extends FeatureSpec with GivenWhenThen with MustMatchers {  
  val d1 = MapData(Map(("id","1"),("a","5foo"),("b","1234bar234")))  
  val d2 = MapData(Map(("id","2"),("a","foo"),("b","bar45")))  
  val d3 = MapData(Map(("id","3"),("a","454foo45"),("b","445bar")))  
  val d4 = MapData(Map(("id","4"),("a","43foo45"),("b","bar87")))  
  val d5 = MapData(Map(("id","5"),("a","foo45"),("b","3bar4546")))  
  val d6 = MapData(Map(("id","6"),("a","cfoo"),("b","bar")))  
  object MapDataSet extends DataSet {  
  // in production, DataSet may come from a database (can be read one record at a time in foreach)  
  val data = List(d1,d2,d3,d4,d5,d6)  
    def foreach(f:(Data)=>Unit):Unit = { data.foreach(f) }  
  }  
  val r0 = "f:b /^[0-9][0-9]+.*/ Convert x:transformAandB" // convert a and b if b starts with two numbers  
  val r1 = "f:b /^[0-9]+.*/ Convert x:transformA" // convert a if b starts with a number  
  val r2 = "f:a /^[0-9]+.*/ Filter" // filter if a starts with a number  
  val r3 = "f:b /^b.*/ Convert x:transformB" // convert b if b starts with "b"  
  val r4 = "f:a /^c.*/ NoMatch" // bad rule  
  val rules = RuleList(List(r0,r1,r2,r3,r4),MyRuleParser)  
  class ListOut extends Output {  
    var container = new ListBuffer[Data]()  
    def apply(dataIn:Data) = {  
     container += dataIn  
    }  
  def getById(n:String):Data = { container.filter( (d) => d("id") == n )(0) } // assuming one match here  
    override def toString = "***OUTPUT***\n"+container.mkString("\n")  
  }  
  Log.setType("list")  
  feature("Conversion API"){  
  scenario("Filtered Output"){  
   given("The default testing scenario set up above")  
   when("I run the conversion")  
   val out = new ListOut  
   Conversion(MapDataSet,rules,out)  
   then("I expect to see five records in the output")  
   out.container must have length 5  
   then("The filtered record should be id 4")  
   out.container.map( d => d("id") ).contains("4") must be === false  
   then("The records not filtered would be 1,2,3,5,6")  
   out.container.map( d => d("id") ).sameElements(List("1","2","3","5","6")) must be === true  
  }  
  }  
 }  

So, this system is geared toward Acceptance Testing via dependency injection. And ScalaTest's FeatureSpec is great for this. It even gives you a nice "Given When Then" syntax, so that your real-world acceptance criteria can easily be translated into automated acceptance tests. You are writing acceptance criteria, aren't you!?!? ;-)

Other acceptance tests follow more or less the same pattern.

BTW, Here's the code for the Conversion object:

 object Conversion extends Logging {  
   def apply(data:DataSet, rules:RuleSet, output:Output) = {  
     data.foreach(  
       d => {  
         log("PRE PROCESSING: "+d)  
         val o = rules(d)  
         log("POST PROCESSING: "+o)  
         if(!o.isFiltered()) { log("ADDING TO OUTPUT: "+o); output(o); }  
       }  
     )  
   }  
 }  

The foreach method on the DataSet is called and the rules applied to each resulting object. So, by implementing a "DbDataSet", for example, these records could be pulled from a database.
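To make the DbDataSet idea concrete, here's a hypothetical sketch using plain JDBC. None of this is in the project code; the URL, query, and column names are made-up illustrations, and I've included minimal stand-ins for the Data/DataSet types so the snippet stands on its own:

```scala
import java.sql.DriverManager

// Minimal stand-ins so this sketch compiles by itself; the real project
// defines richer Data/DataSet types.
case class MapData(fields: Map[String, String])
abstract class DataSet { def foreach(f: MapData => Unit): Unit }

// A hypothetical JDBC-backed DataSet -- not part of the project code.
class DbDataSet(url: String, query: String, columns: List[String]) extends DataSet {
  def foreach(f: MapData => Unit): Unit = {
    val conn = DriverManager.getConnection(url)
    try {
      val rs = conn.createStatement().executeQuery(query)
      while (rs.next()) {
        // one record at a time; the whole result set never lives in memory
        f(MapData(columns.map(c => (c, rs.getString(c))).toMap))
      }
    } finally { conn.close() }
  }
}
```

Because Conversion only depends on the abstract DataSet, swapping this in requires no changes to the conversion logic or the tests.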

I'll just show the RuleSet code, because I love this little Scala trick:

 case class RuleList(rs:List[String],rp:RuleParser) extends RuleSet with Logging {  
   val rules = rs.map { r => rp.parse(r) }.filter{ r => r != None }.map{ r => r.get }.toSeq  
   def apply(dataIn:Data):Data = {  
     log("APPLYING RULES")  
     Function.chain(rules)(dataIn)  
   }  
 }  

I suspect there might be a cleaner way to do that map/filter/map thing, but the cool thing is the line: Function.chain(rules)(dataIn). With the "rules" being implementations of Function[Data,Data], that line says "apply this data object to this chain of Rule functions". Nice.
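If Function.chain is new to you, here's a tiny self-contained illustration with plain String functions (nothing project-specific):

```scala
// Function.chain composes a sequence of T => T functions left to right:
// the output of each function becomes the input of the next.
val upper: String => String = _.toUpperCase
val exclaim: String => String = _ + "!"
val pipeline: String => String = Function.chain(Seq(upper, exclaim))
println(pipeline("nice"))  // prints "NICE!"
```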

This line sets up the rules, and references the RuleParser:

 val rules = RuleList(List(r0,r1,r2,r3,r4),MyRuleParser)  

Here's the code for the RuleParser:

 object MyRuleParser extends RegexParsers with RuleParser {  
   def fieldStart:Parser[String] = "f:"  
   def field:Parser[String] = "[A-Za-z]+".r ^^ { _.toString }  
   def functionStart:Parser[String] = "x:"  
   def function:Parser[String] = "[A-Za-z]+".r ^^ { _.toString }  
   def pattern:Parser[Regex] = "/.*?/".r ^^ { case p => new Regex("/".r.replaceAllIn(p,"")) }  
   def action:Parser[String] = "Filter|Convert".r ^^ { _.toString }  
   def rule1:Parser[Rule] = fieldStart~field~pattern~action~functionStart~function ^^ { case fs~f~p~a~xs~x => Rule(new PatternMatcher(f,p),a,ConvertFuns.functions(x)) }  
   def rule2:Parser[Rule] = fieldStart~field~pattern~action ^^ { case fs~f~p~a => Rule(new PatternMatcher(f,p),a,NoConvert) }  
   def rule:Parser[Rule] = rule1 | rule2  
   def parse(input:String):Option[Rule] = parseAll(rule,input) match {  
     case Success(e,_) => Some(e)  
     case f: NoSuccess => None  
   }  
 }  

This uses parser combinators to turn our "Rule" (strings) into "Rule" objects. Rule itself looks like this:

 case class Rule(matcher:MatchFun, action:String, conversion:ConvertFun) extends Function[Data,Data] with Logging {  
   def apply(dataIn:Data):Data = {  
     log("RULE: "+this.action+" on data "+dataIn+":")  
  if(dataIn.isFiltered){ log("FILTERED"); dataIn }  
  else if(matcher(dataIn)){  
   action match {  
   case "Filter" => { log("FILTERING"); dataIn.setFiltered() }  
   case "Convert" => { log("CONVERTING"); conversion(dataIn) }  
   case _ => { log("BAD ACTION"); dataIn }  
   }  
  }  
  else{ log("NO MATCH"); dataIn }  
   }  
 }  

Now, I'd like to talk about unit tests.

For unit tests, I chose to use the FlatSpec trait in ScalaTest. The reason is that I wanted to do BDD, with the basic "Object when something should something in...{ code }" type syntax, but I really didn't like all the nesting that goes on in most libraries. Bill Venners has solved that problem for us with FlatSpec. The name is as it implies. Here are some examples...

 class MapDataSpec extends FlatSpec {  
  val m = MapData(Map(("id","1")))  
  "A MapData object with an 'id' field" should "have an 'id' key" in {  
  expect(true){ m.has("id") }  
  }  
  "A MapData object with an 'id' field with value '1'" should "return that value for that key" in {  
  expect("1"){ m("id") }  
  }  
  "A MapData object which has no 'a' field set" should "throw error when accessed" in {  
  expect(false){ m.has("a") }  
  intercept[NoSuchElementException]{ m("a") }  
  }  
  "A MapData object which has a field set" should "recognize that key and return that value for that key" in {  
  val m2 = m.set("a", "foo")  
  expect(true){ m2.has("a") }  
  expect("foo"){ m2("a") }  
  }  
  "A MapData object which has not had filtered set" should "return false for isFiltered" in {  
  expect(false){ m.isFiltered }  
  }  
  "A MapData object which has filtered set" should "return true for isFiltered" in {  
  val m2 = m.setFiltered  
  expect(true){ m2.isFiltered }  
  }  
 }  

And the unit tests continue on like that (please see linked code for details). Nice, simple syntax. Very intuitive.

I hope this makes some sense, and proves helpful to some.

Edit: BTW, I have found a slightly more elegant way of doing this:

 val rules = rs.map { r => rp.parse(r) }.filter{ r => r != None }.map{ r => r.get }.toSeq  

which is this:

 val rules = rs.map { r => rp.parse(r) }.flatMap{ x=>x }.toSeq  
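The same idiom works with any Option-producing function, since Option converts implicitly to an Iterable. Here's a self-contained example with a stand-in parser (not the project's RuleParser):

```scala
// Stand-in parser: Some(n) on success, None on failure.
def parse(s: String): Option[Int] =
  try Some(s.toInt) catch { case _: NumberFormatException => None }

val raw = List("1", "two", "3")
val viaMapFilterMap = raw.map(parse).flatMap(x => x)  // List(1, 3)
val direct = raw.flatMap(parse)                       // same result, one pass
```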
