Within the context of testing

This post is written for .NET, using C#, but the methods in use should be fairly easy to implement in most other languages.



I’ve often looked for a good way to write reusable testing scenarios that were clean to use, easy to set up and most importantly – easy to read six months afterwards.

Looking at one of Ayende’s blog posts, where he discusses scenario driven tests, I found something lacking: whilst he has a great thing going for testing scenarios against a set context, he has limited himself to a very constrained situation, something that I needed to expand to suit my needs. The issue is that having a context that represents “the system” is not enough – I want to be able to specify exactly the state that my system is in when I execute a scenario, with the ambition of being able to re-use the various situations.


Situations and Settings

I wanted to be able to change contexts with ease, so that my test scenarios ultimately look something like:

public class EventMonitorTests : TestBase
{
    public void Given_When_Then( )
    {
        using( DdTestContext testContext = new DdTestContext( true ) )
        {
            // Prepare
            int initialCount = testContext.Clients.TotalResponseCount( );
            PrepareSituation< RegisterThreeClients >( testContext );

            // Invoke
            ExecuteScenario< RespondToAllClients >( testContext );

            int actualCount = testContext.Clients.TotalResponseCount( );

            // Assert
            actualCount.ShouldBeExactly( initialCount + 3 );
        }
    }
}


The idea is to introduce an interface ISituation as well as IScenario that are known to my test base-class. I can then invoke a Situation and Scenario in various combinations, so that I can test behaviors under different circumstances without having to repeat myself.

ISituation has a Prepare method, while IScenario has an Execute method, both accepting the testContext object:

public interface ISituation
{
    void Prepare( DdTestContext testContext );
}

public interface IScenario
{
    void Execute( DdTestContext testContext );
}

I gave it some thought as to whether I needed this separation or not, but concluded that having a clean separation between rigging a situation and executing a scenario helps the reader better understand the tests.

Additionally, I needed to declare the methods within the base class that are responsible for executing PrepareSituation and ExecuteScenario:

public void PrepareSituation< T >( DdTestContext testContext ) where T : ISituation, new( )
{
    T situation = new T( );
    situation.Prepare( testContext );
}

public void ExecuteScenario< T >( DdTestContext testContext ) where T : IScenario, new( )
{
    T scenario = new T( );
    scenario.Execute( testContext );
}

The generic constraints limit the methods to accepting only types that implement the respective interfaces.

This takes care of the execution. For the sake of being practical, whenever I am writing a new batch of scenarios that I would like to test, I often end up writing most ISituation implementations within a single Situations.cs file, as well as gathering the IScenario implementations in a corresponding Scenarios.cs file. This is purely to keep the aspects together when they are simple.



This depends largely on the complexity of the situations and scenarios. For rigging up more complex scenarios, I would normally use a sub-folder within my solution explorer and name each file after its class name, as is normal.
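As an illustration, the two classes used in the test at the top might look something like this. This is only a sketch: the Register and Respond calls, and the Client type, are hypothetical stand-ins for whatever your own domain API offers.

```csharp
// Situations.cs – rigs the state the system should be in before the scenario runs
public class RegisterThreeClients : ISituation
{
    public void Prepare( DdTestContext testContext )
    {
        // Hypothetical registration API on the test context
        for( int i = 0; i < 3; i++ )
            testContext.Clients.Register( "client-" + i );
    }
}

// Scenarios.cs – the behavior being exercised against that state
public class RespondToAllClients : IScenario
{
    public void Execute( DdTestContext testContext )
    {
        // Hypothetical Respond call; one response per registered client
        foreach( Client client in testContext.Clients )
            client.Respond( );
    }
}
```

Because both classes only depend on the test context, the same RegisterThreeClients situation can be re-used under any number of scenarios.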


The Context class

Initially, I wanted a nice, controlled way to manage transactions for my tests, as they often interact with a database. The class DdTestContext represents the system under test as a whole, and can thus contain some sort of transaction object that is configured for the database being tested.


public class DdTestContext : IDisposable
{
    private Transaction _transaction;

    public DdTestContext( bool useTransaction )
    {
        if( useTransaction )
        {
            _transaction = new Transaction( );
            _transaction.Begin( );
        }
    }

    public void Dispose( )
    {
        if( _transaction != null )
        {
            _transaction.Rollback( );
            _transaction = null;
        }
    }
}

Thus, with this construct, I now have a somewhat stable way of executing my tests within a transactional context without too much effort:

public void Given_When_Then( )
{
    using( DdTestContext testContext = new DdTestContext( true ) )
    {
        // Prepare

        // Invoke

        // Assert
        Assert.Inconclusive( "Not yet implemented" );
    }
}

This instantly became a code snippet 🙂

Every test that executes within the using clause will initiate a transaction and roll it back again once execution leaves the scope (including when an unhandled exception occurs). My database is safe, for now.

Whenever I don’t need database support (unit tests), I can simply pass false as the constructor argument, and no transaction will be initiated for that particular context.
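Such a transaction-free unit test might be sketched like this. The ClientNameParser class and its Normalize method are purely hypothetical, just to show the shape:

```csharp
public void Normalizes_Client_Name( )
{
    // Passing false: no transaction is started, so no database is touched
    using( DdTestContext testContext = new DdTestContext( false ) )
    {
        // Prepare / Invoke against plain in-memory objects
        string parsed = ClientNameParser.Normalize( " Fabrikam " );

        // Assert
        parsed.ShouldBeExactly( "Fabrikam" );
    }
}
```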

I dragged the using statement into each test, as opposed to initializing the test context in a pre/post method, in order to control the use of transactions individually for each test.


Still to come

  • How I expanded the testcontext with useful functions
  • Using extension methods extensively

Comments are very welcome 🙂

Painful correctness

Catching up

Back to work again with my new employer, this time (back) in the role of an IT consultant, and a senior one at that. It is refreshing to work in an environment that is constantly challenging my knowledge about various technologies. Now if I could only figure out how to get used to having a laptop with me back and forth every single day, I’ll be one happy camper..

Away, fatness!

Managing to work out in the morning without sweating through my clothes afterward is a challenge. As spinning classes are now back to their normal three-day-per-week schedule, I need to figure out how to stop sweating after the shower!

Look sharp! Look dry!

A thing that is bugging me is that it really isn’t easy bringing a gym bag, a laptop backpack AND dressing nicely, all on top of my mountain bike, on the way to the train station. This is getting to the point where I am seriously considering taking the car to the train station just to avoid the hassle. Clearly, being environmentally aware is not compatible with being a commuter – not in the consultant business!

Rough waters

The basement is now dry after the flood and ready for a fresh coat of paint. We still haven’t moved anything down there, as we are hoping to do something with the roof while we’re painting and laying new flooring.

Chip! chip! chip! chip!

Photography has been dead the last month or so. With the new job taking up all my available brain power, and Stargate SG-1 taking up the spare time, I am saving most of my creative energy for work, but I still have managed to push forward a project for controlling my Pre-Amplifier. Expect a dedicated home page soon. I had a similar project far too many years ago, written in C++ using MFC and virtually no knowledge about agile software development. The new version is being written in C# using a 3-tiered model, complete with a separate unit testing project, installer, user documentation, and scrum as the project management methodology. The thing is practically writing itself!

Tell me about myself!

Oh, I woke up this morning thinking that I have really never written an “about-me” page in spite of having had a home page on the internet since around 1995! Of course, you cannot wake up a Sunday morning thinking a thing like that without immediately scratching your groin, jumping out of bed, and writing one, so.. I did, and here is the result.

One Saturday morning…

I spent the better part of this Saturday morning fixing up an application that will let a slacker friend of mine export his images to his Apache-based web server. At the same time, I figured I needed a wee bit shorter URLs to my own images (if you’ve ever tried to look at the image URL, you’d probably be scared!), so I wrote a small wrapper that is able to fetch the correct image using the filename of the original without disclosing the RAW image that lies behind. Pretty cool, innit?


This lets me insert images like this without a link that takes up 5 lines of space, AND they now include an embedded watermark! Go ahead! Right-click the image, check its properties, and look at the sexah URL!

So tell me, what did YOU do this morning? 🙂

Galleries and more

Having found myself with some extra time to spend on developing useless but fun projects at home, I came across this little plugin for Firefox named PicLens. It is truly the most amazing way to display images on the web that I’ve seen so far, so I instantly implemented a web service to speak with my FotoWeb server and return Media RSS. I wonder why I have not done this before..

Anyway, I did not want to expose any of my internal FotoWeb details, such as credentials, archives, etc. to the outside world, so I wrote a web application that communicates with my web service and “finalizes” the RSS stream there. In a little over 2 hours, I now have a zero-to-hero solution where the feed for my brand new PicLens gallery is supplied by a web application that in turn gets its juice from a web service that speaks with my FotoWeb.

My virtual chest just grew 3 new thick hairs!

BOINC! ET knows your pin-code!

I’m what you could call a skeptic when it comes to the search for extra-terrestrial intelligence, but an open-minded one at that. I only recently installed the new BOINC manager, made by the Berkeley staff – the new client that allows you to donate your computer’s spare time (the time it would otherwise be running a screen saver).

The SETI project, as it is called, receives a large amount of data from space and splits this data into tiny packages that are sent out to believers and “donors” for analysis. Hundreds of thousands, maybe even millions of computers are donating a tiny bit of their time, and together they form a super computer with insane number-crunching powers! Number crunching! Where’s that useful…
Have you ever heard of rainbow tables? No? Guess what, you’re in for the surprise of your life. First follow the link on rainbow tables to get an idea of what they are, then think about the SETI project. Imagine that type of number-crunching power used to un-hash every possible combination of codes that you could ever dream up from a common function, such as MD5.. That would mean that if I got hold of your credit card, and knew what type of algorithm your bank uses to encode your PIN code, I could suddenly just read the card’s hash code and look it up in my rainbow table!

Good thing I don’t have the computing power to calculate that table, right? It would literally take thousands of computers like my laptop to perform the calculations within my lifetime.. hmm.. oh, hang on.. what’s this?? Follow this link here, people. I rest my case.

The tables are huge – around 450 GB for a single table is quite a bit of data, or at least it used to be, only 4 years ago. Today, 450 GB costs “nothing”. Terabyte storage capacity is just around the corner for you and me. I could buy a 750 GB hard drive today if I wanted to, and there is nobody to stop me from doing it. With distributed download, I could easily download that mother of all tables within a week or so, and I would then be ready to be a pirate. All thanks to BOINC.
Bergfrid knows that her credit cards were hacked; she wasn’t phished or phracked as the bank claims. Her cards were stolen, her credit card was read in a pocket reader, and then someone did a table lookup to get the PIN codes that correspond to the hash there.

Any person with even the most basic knowledge of software development can tell you that an MD5 hash is no longer secure once you are given the ability to simply look the hash up in a table. You no longer need to “guess” the proper hash; you can just look it up.
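The table lookup described above can be sketched in a few lines of C#. This is a toy example: it brute-forces a plain dictionary of MD5 hashes for all 4-digit codes (real rainbow tables use hash chains to trade time for space, and real bank schemes are more involved than hashing the PIN directly):

```csharp
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

class RainbowSketch
{
    // Hex-encode the MD5 digest of a string.
    static string Md5Hex( string input )
    {
        using( MD5 md5 = MD5.Create( ) )
        {
            byte[] hash = md5.ComputeHash( Encoding.ASCII.GetBytes( input ) );
            return BitConverter.ToString( hash ).Replace( "-", "" ).ToLowerInvariant( );
        }
    }

    static void Main( )
    {
        // Precompute hash -> PIN for every 4-digit code: the "table".
        // 10,000 entries take milliseconds; a table over a real key space
        // is where the BOINC-scale computing power comes in.
        var table = new Dictionary<string, string>( );
        for( int pin = 0; pin <= 9999; pin++ )
        {
            string code = pin.ToString( "D4" );
            table[ Md5Hex( code ) ] = code;
        }

        // A hash read off a card now becomes a simple dictionary lookup.
        string stolenHash = Md5Hex( "4711" );
        Console.WriteLine( table[ stolenHash ] ); // prints 4711
    }
}
```

The point stands regardless of the hash function: once every possible input has been hashed ahead of time, recovering the input is a lookup, not a guess.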

Banks around the world – wake up. BOINC made it possible for hackers to unite their computing power and to perform calculations that will thwart the security of VISA cards. I want my 8-digit PIN code NOW!

Vista – completely useless, yet, extremely addictive

I’ve had the good fortune of being able to install Windows Vista on my laptop, to experience the look and feel of this new operating system. Was it new enough to be worth the upgrade?

Let me start by telling you that Vista does not introduce anything that I’d call a “must-have” feature. If you’re used to Windows XP and don’t want to upgrade to Vista, you don’t have to either. There’s nothing there that you can’t do without, and with so many years of patches, XP has become one of the better OSes out there in my view.

BUT, Vista gives you a lot of candy! And I’m not talking about Aero – I had to switch that off in order to keep my laptop’s speed up; it wouldn’t run Google Earth smoothly while Aero was activated. Shame, it looks smart, but not that smart. What I really like in Vista is how everything’s organized. It has become so heavily search-oriented that you’re about to start thinking SOOS instead of DOS (Search Oriented Operating System as opposed to Disk Oriented System). The introduction of a performance index is pure genius. You need to know if a game will run on your computer? Just look at the number. Mine’s a “3.6”, which means all the games I buy are expected to run smoothly as long as they don’t require more than that.

There’s too much to write about Vista for this blog spot, but my preliminary verdict is this: I’d never spend a dime purchasing an upgrade for my computers. For that, I have no reason, since XP does what I need it to do, but darned, it looks and feels VERY nice. Let’s see how long it takes before the Linux geeks have a copy of it :p

V for violet

From time to time, I get the urge to sit down and holler at the world and stir up a cascade of accusations and complaints about something that is less than trivial to begin with. Now is one of those times; I’m about to take upon me the role of the pixel peeper, never satisfied with less than perfection. It turns out that my Nikon D200 is completely and utterly unable to display violets! Someone, with a great deal of spare time on his hands and nothing else to spend it on, has found that if you take your every-day primary school prism, arrange it so that it will show you the colors of the rainbow, and then take a photograph of said phenomenon, the violets will be completely gone – missing, astray, you name it, they won’t be there. Instead, a slight blue-ish shade will be there in their stead. Additionally, colors that are seamless in their transition between primaries (red, green, blue) are amazingly not so at all once shot onto a digital chip. These facts, in the proper forums, spawn such large debates and outbreaks of sheer panic that I often wonder why I frequent those places to begin with. Sadly, they’re also the best spots to be among fellow digital photographers who have a tendency for taking more images than the yearly family portrait in front of the Christmas tree.

New looks for Bergfrid

Speaking of which, Bergfrid had her hair done the other day and came home looking so fine that I had to rig up the stuff to freeze her hairstyle in time. Her hairdresser had some spare time on her hands and did some fancy handwork. This image shows you how smart she looked. I was going for a couple of full figure images as well, but that grey backdrop just wasn’t big enough to serve as a full figure backdrop, so I dropped it (ooh, I’m rambling now!!)
Today’s last note would have to be mentioning the free vs. closed source debate that never seems to end. I’m torn between two camps and really not sure which one to go with – on one hand, I’m employed at, and make a living off of, closed-source code. Our income is based on selling licenses for using patented software. This is bad, according to the open source community, which believes that my money should be made from customizing that source instead – that I should share the code of the software I wrote with others, and then compete with them for making a living off of it. I hear them, I just can’t get myself to understand them..
See you next time!