SpecFlow – First month of BDDing

I introduced SpecFlow to my team in December 2013. I was really sold on BDD by three of its main benefits:

  1. Living documentation – BDD creates up-to-date and relevant documentation
  2. Increased collaboration – BDD promotes discussion up-front and reduces handoffs that traditionally occur.
  3. Build the right thing the first time – Through increased collaboration there’s better understanding of what needs to be built and there are fewer surprises when the product is finished.

Since introducing SpecFlow, I would say we have seen more communication between QA, developers and Product Owners. It hasn’t been a dramatic, complete change to our process, but it has improved communication. With code reviews there’s been significantly more review of tests. Reviewing tests can be a bit tedious, and if you’re not familiar with an area of code, reviewing all the test scenarios isn’t easy. With Gherkin tests everyone on the team can (and usually does) read them and ask questions or raise concerns.

Writing the tests has involved a pretty steep learning curve for the more junior developers. I think most people are now familiar with it, but it’s taken time. Senior developers have a completely different challenge: philosophizing about how best to refine and optimize specifications.

Our specification set is starting to get large enough that we are now thinking about how best to organize it. Unless you’re a technical writer, structuring documentation is challenging, and this is proving no different. Our first step has been to separate higher-level, stakeholder-focused scenarios from boundary-condition scenarios. This seems to maintain clarity for business stakeholders while still supporting the more detailed scenarios for QA and developers. SpecFlow does offer a $$$ tool for organizing the documentation; maybe we’ll look at that at some point.

So it’s been a fun month with SpecFlow. Getting going with the specifications has been a bit challenging, but people have warmed to them and they’re becoming more and more a part of our agreed-on best process. My presentation that kicked off the whole process is below. It’s brief and lacks an animated me flailing arms around in front of everyone, but it has some good key points. specflow-presentation

Posted in BDD | Tagged | Leave a comment

Jenkins + GitHub, MSBuild, MSTest (and SpecFlow)

After experimenting with SpecFlow a bit, I now want to run the tests automatically with Jenkins. It turns out it was incredibly easy. Really there are just 4 straightforward steps, which I think are best explained with screenshots.

1. Install Jenkins

The easiest way to do this is with Chocolatey.

cinst Jenkins

2. Install Plugins

I installed the Git, MSTest and MSBuild plugins.


3. Configure Global Settings

I had to set the paths for MSTest and MSBuild.


4. Configure my project

This is a very straightforward project. It gets the latest source from Git, builds it with MSBuild and then runs MSTest. The tests are actually written with SpecFlow, but they are compiled and run as standard MSTest tests.
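For reference, the two build steps Jenkins runs boil down to commands like these (the solution and test assembly names here are placeholders, not from the actual project):

```shell
# MSBuild plugin build step: compile the solution
msbuild MySpecFlowProject.sln /p:Configuration=Release

# MSTest plugin post-build step: run the compiled tests and write a
# TRX results file that the MSTest plugin can publish
mstest /testcontainer:MySpecFlowProject.Tests\bin\Release\MySpecFlowProject.Tests.dll /resultsfile:TestResults.trx
```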



Final Thoughts

I realize my computer already has Visual Studio, .Net and everything else I need set up. Normally on a build server you have to install all of that from scratch. Running MSTest without Visual Studio is a bit of a pain too; MS must figure that people will be running their tests from Team System. Overall, I’m really impressed with Jenkins. It takes no time to get an automated build running, even with no previous experience.

Posted in .Net, C# | Tagged , , | 1 Comment

BDDing with SpecFlow

Unit testing and TDD have been standard development practices for some time. Unit testing helps build loosely coupled software and verifies code is functioning correctly. Where it comes up short is verifying that the software being built actually satisfies the business goals. MSTest, NUnit and most other test frameworks can be used for testing high-level business processes. Unfortunately, the tests are written in a programming language like C# that is not easily shared with stakeholders, making it difficult to verify that the software is achieving the business goals.

Behavior Driven Development (BDD) attempts to solve this problem. Tests are written at a level business stakeholders can review and understand. Stakeholders, working with the development team, write examples of how the software should behave, and the examples are copied into code verbatim! The result is tests that match the business requirements because the business stakeholders wrote (or co-wrote) them. And the tests function as living documentation: the documentation does not go stale because the documentation is the tests, and the tests can be run regularly as part of automated builds.

This post focuses on SpecFlow, one of the more popular BDD frameworks for .Net. Most importantly for my needs, it integrates with Visual Studio and MSTest, and it looks like it will fit easily into my existing project.

Getting started

1. Install the SpecFlow Visual Studio plugin:


2. Create a Visual C# – Unit Test Project

3. Install the SpecFlow NuGet package

Install-Package SpecFlow

4. Create an App.config file with the following:

    <!-- For additional details on SpecFlow configuration options see http://go.specflow.org/doc-config -->
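(The config snippet above was stripped down to just the comment. For SpecFlow with MSTest, a typical App.config looks roughly like this, per the SpecFlow docs; the NuGet package normally generates most of it for you:)

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <configSections>
    <section name="specFlow"
             type="TechTalk.SpecFlow.Configuration.ConfigurationSectionHandler, TechTalk.SpecFlow" />
  </configSections>
  <specFlow>
    <unitTestProvider name="MsTest" />
    <!-- For additional details on SpecFlow configuration options see http://go.specflow.org/doc-config -->
  </specFlow>
</configuration>
```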

5. Create a SpecFlow Feature File – Merge.feature in this example


6. Write SpecFlow Specification

Feature: Merge
	Merging users from one repository to my local database.

Scenario: Merge one user
	Given database is empty
	Given database does not have a user with Id 1
	When Merge User with Id 1
	Then database has user with Id 1

Hitting Ctrl-S will save the Feature file and generate the MSTest stubs. The next step is to generate the Step Definitions which are invoked from the MSTest stubs.
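For reference, the code-behind SpecFlow generates for the scenario above looks roughly like this (trimmed; the exact shape varies by SpecFlow version):

```csharp
[TestClass]
public partial class MergeFeature
{
    [TestMethod]
    public virtual void MergeOneUser()
    {
        // The generated method replays the Gherkin steps through the test runner
        TechTalk.SpecFlow.ScenarioInfo scenarioInfo =
            new TechTalk.SpecFlow.ScenarioInfo("Merge one user", ((string[])(null)));
        this.ScenarioSetup(scenarioInfo);
        testRunner.Given("database is empty");
        testRunner.Given("database does not have a user with Id 1");
        testRunner.When("Merge User with Id 1");
        testRunner.Then("database has user with Id 1");
        testRunner.CollectScenarioErrors();
    }
}
```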

7. Generate SpecFlow Step Definitions

Right-click on the specification to select the Generate Step Definitions option. This generates the SpecFlow C# Binding where we map from the Specification to standard C# code.


8. Implement Step Definitions

    [Binding]
    public class MergeSteps
    {
        private readonly Merger merger = new Merger();

        [Given(@"database does not have a user with Id (.*)")]
        public void GivenDatabaseDoesNotHaveAUserWithId(int userId)
        {
            var existingUser = merger.FindUser(userId);
            // The rest of this body was truncated in the original post;
            // asserting the user is absent is the natural check here.
            Assert.IsNull(existingUser);
        }

        [Given(@"database has User with Id (.*)")]
        public void GivenDatabaseHasUserWithId(int userId)
        {
            merger.Merge(new User() { UniqueId = userId });
        }
    }

9. Run the tests

The SpecFlow tests can now be run like normal unit tests!

SpecFlow – Custom Attributes

I would like to be able to add annotations to SpecFlow specifications and have them translate to C# attributes on the generated test methods.

Given a specification like this…

Scenario: Merge one user alias. Deactivate an alias that was deleted.
	Given database has User with Id 1
	Given database User with Id 1 has addresses [foo1@gr.com|Active, foo2@gr.com|Active]
	When Merge User with Id 1 and addresses [foo2@gr.com|Active, foo3@gr.com|Active]
	Then database has user Id 1 and addresses [foo1@gr.com|Inactive, foo2@gr.com|Active, foo3@gr.com|Active]
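The Author and Jira values used further down come from SpecFlow Tags on the scenario. Assuming a Name:Value tag convention (my assumption; the original post never shows the tagged spec), they would look like this:

```gherkin
@Author:John.Smith @Jira:SMG-1374,SMG-223
Scenario: Merge one user alias. Deactivate an alias that was deleted.
```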

I want the generated code to have my custom attribute, like this:

[MyAttribute( "Merge one user alias. Deactivate an alias that was deleted.", "John.Smith", "SMG-1374,SMG-223" )]
         public virtual void Test() {...}
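The post never shows MyAttribute itself; a minimal definition matching the three-argument usage above might look like this (the property names are my guesses, not from the original source):

```csharp
using System;

[AttributeUsage(AttributeTargets.Method)]
public class MyAttribute : Attribute
{
    public string Title { get; private set; }
    public string Author { get; private set; }
    public string JiraTickets { get; private set; }

    public MyAttribute(string title, string author, string jiraTickets)
    {
        Title = title;
        Author = author;
        JiraTickets = jiraTickets;
    }
}
```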

Out of the box, SpecFlow only allows adding Tags to specifications, and the Tags generate MSTest TestCategoryAttributes. By default, my Gherkin spec above generates the following code. Using reflection I could probably get the information I need from this implementation, but there should be a way to make SpecFlow do exactly what I want.

         public virtual void Test() {...}

I see SpecFlow does have support for creating plugins. Unfortunately, the documentation is somewhat lacking. Luckily, SpecFlow is open source and there are a few references out there, such as this very helpful blog post: http://blog.jessehouwing.nl/2013/04/creating-custom-unit-test-generator.html.


First, create a Visual C# – Class Library project and install the SpecFlow Custom Plugin NuGet package:

Install-Package SpecFlow.CustomPlugin

Next, create a class derived from MsTest2010GeneratorProvider:

    public class MyGeneratorProvider : MsTest2010GeneratorProvider
    {
        public MyGeneratorProvider(CodeDomHelper codeDomHelper)
            : base(codeDomHelper)
        {
        }
    }
For my purposes, I need the code generation to change in two ways:

  1. Add my custom attribute when “Author” and “Jira” tags are present
  2. Add custom code in the generated TestInitialize() and TestCleanup() methods

Adding Custom Attribute

The MsTest2010GeneratorProvider has a SetTestMethod method to override. This seems like a logical place to add custom attributes to each of the test methods. My implementation searches the given scenario for any Tags that start with Author or Jira. Then, using CodeDom, I add my custom attribute to the code. I also remove the tags so that the base.SetTestMethod() call does not add TestCategoryAttributes to each of the methods.

    public override void SetTestMethod(TestClassGenerationContext generationContext, CodeMemberMethod testMethod, string scenarioTitle)
    {
        foreach (var scenario in generationContext.Feature.Scenarios)
        {
            if (scenario.Title != scenarioTitle)
                continue;

            string title = scenarioTitle;

            if (scenario.Tags != null)
            {
                Tag author = scenario.Tags.FirstOrDefault(x => x.Name.StartsWith("Author"));
                Tag jira = scenario.Tags.FirstOrDefault(x => x.Name.StartsWith("Jira"));

                if (author != null && jira != null)
                {
                    var authorText = author.Name.Split(new[] { ':' }, StringSplitOptions.RemoveEmptyEntries)[1];
                    var jiraText = jira.Name.Split(new[] { ':' }, StringSplitOptions.RemoveEmptyEntries)[1];

                    testMethod.CustomAttributes.Add(
                        new CodeAttributeDeclaration(
                            new CodeTypeReference("MyAttribute"),
                            new CodeAttributeArgument(new CodePrimitiveExpression(title)),
                            new CodeAttributeArgument(new CodePrimitiveExpression(authorText)),
                            new CodeAttributeArgument(new CodePrimitiveExpression(jiraText))));

                    // Remove the tags so base.SetTestMethod() does not also
                    // generate TestCategory attributes for them.
                    scenario.Tags.Remove(author);
                    scenario.Tags.Remove(jira);
                }
            }

            // Attach the scenario steps as an attribute argument. The attribute
            // type was lost from the original listing; Description is a placeholder.
            var text = string.Join("\n", scenario.Steps.Select(c => c.StepKeyword.ToString() + ":" + c.Text));
            testMethod.CustomAttributes.Add(
                new CodeAttributeDeclaration(
                    new CodeTypeReference("DescriptionAttribute"),
                    new CodeAttributeArgument(new CodePrimitiveExpression(text))));
        }

        base.SetTestMethod(generationContext, testMethod, scenarioTitle);
    }

TestInitialize() and TestCleanup()

I need to add two public properties plus logic in the Initialize and Cleanup methods. SpecFlow has SetTestClass and SetTestClassInitializeMethod methods to override, but there is no SetTestClassCleanupMethod. I found that for my needs it didn’t matter where I put the code generation logic, so I put it all in SetTestClassInitializeMethod.

Here’s the code I used to modify the TestInitialize and TestCleanup methods:

    public override void SetTestClassInitializeMethod(TestClassGenerationContext generationContext)
    {
        // Note: the Members.Add(...) calls registering the generated members
        // were missing from the original listing and have been restored.
        var field = new CodeMemberField()
        {
            Name = "testContext",
            Type = new CodeTypeReference("Microsoft.VisualStudio.TestTools.UnitTesting.TestContext"),
            Attributes = MemberAttributes.Private
        };
        generationContext.TestClass.Members.Add(field);

        field = new CodeMemberField()
        {
            Name = "testCase",
            Type = new CodeTypeReference("TestLogConnector.TestCase"),
            Attributes = MemberAttributes.Private
        };
        generationContext.TestClass.Members.Add(field);

        var codeMemberProperty = new CodeMemberProperty();
        codeMemberProperty.Name = "TestContext";
        codeMemberProperty.Type = new CodeTypeReference("Microsoft.VisualStudio.TestTools.UnitTesting.TestContext");
        codeMemberProperty.Attributes = MemberAttributes.Public;
        codeMemberProperty.HasGet = true;
        codeMemberProperty.HasSet = true;
        codeMemberProperty.GetStatements.Add(
            new CodeMethodReturnStatement(
                new CodeFieldReferenceExpression(
                    new CodeThisReferenceExpression(), "testContext")));
        codeMemberProperty.SetStatements.Add(
            new CodeAssignStatement(
                new CodeFieldReferenceExpression(
                    new CodeThisReferenceExpression(), "testContext"),
                new CodePropertySetValueReferenceExpression()));
        generationContext.TestClass.Members.Add(codeMemberProperty);

        codeMemberProperty = new CodeMemberProperty();
        codeMemberProperty.Name = "TestCase";
        codeMemberProperty.Type = new CodeTypeReference("TestLogConnector.TestCase");
        codeMemberProperty.Attributes = MemberAttributes.Public;
        codeMemberProperty.HasGet = true;
        codeMemberProperty.HasSet = true;
        codeMemberProperty.GetStatements.Add(
            new CodeMethodReturnStatement(
                new CodeFieldReferenceExpression(
                    new CodeThisReferenceExpression(), "testCase")));
        codeMemberProperty.SetStatements.Add(
            new CodeAssignStatement(
                new CodeFieldReferenceExpression(
                    new CodeThisReferenceExpression(), "testCase"),
                new CodePropertySetValueReferenceExpression()));
        generationContext.TestClass.Members.Add(codeMemberProperty);

        generationContext.TestInitializeMethod.Statements.Add(new CodeSnippetStatement(
            @"            if (TestContext != null)
                TestCase = new TestLogConnector.TestCase(GetType(), TestContext.TestName);"));

        generationContext.TestCleanupMethod.Statements.Add(new CodeSnippetStatement(
            @"            if (TestContext != null)
                /* the cleanup body was truncated in the original listing */;"));

        base.SetTestClassInitializeMethod(generationContext);
    }

Referencing the Plugin

The next step is to implement IGeneratorPlugin and register my custom unit test provider:

[assembly: GeneratorPlugin(typeof(MyGeneratorPlugin))]

public class MyGeneratorPlugin : IGeneratorPlugin
{
    public void RegisterDependencies(ObjectContainer container)
    {
    }

    public void RegisterCustomizations(ObjectContainer container, SpecFlowProjectConfiguration generatorConfiguration)
    {
        container.RegisterTypeAs<MyGeneratorProvider, IUnitTestGeneratorProvider>();
    }

    public void RegisterConfigurationDefaults(SpecFlowProjectConfiguration specFlowConfiguration)
    {
    }
}

The final step is to compile the assembly (hopefully it compiles) and place the library in a path accessible to SpecFlow. Plugins are loaded from several locations; GeneratorPluginLoader.cs contains the logic for how SpecFlow loads them.

I found the easiest place to put the plugin is in the packages\SpecFlow.1.9.0\tools folder.

Finally, reference the plugin in the main project’s App.config file:

      <plugins>
        <add name="MyGenerator" type="Generator"/>
      </plugins>

Next time the Feature file is saved the code will be generated using the plugin!


I’m interested to see how well SpecFlow works over an extended period of time. My hope is that it provides long term value to my project in the form of better documentation and more easily maintained tests.

All source code for this can be found here:


Posted in .Net, C#, Unit Testing | Tagged , , , , , | 1 Comment

DotNetFlume Client Pre-Release with Avro!

I spent considerable effort porting the Java Avro RPC support to C#. The Avro C# RPC code has recently been merged into the Avro SVN trunk. I’ve been using the patch at work for several months now with no problems. I figure now is a good time to make a pre-release of the DotNetFlumeNG client with Avro support. There’s not a lot of uptake on the log4net client, so for now I’m only supporting NLog for the pre-release.

To install run:

Install-Package DotNetFlumeNG.Client.NLog -Pre

The NuGet package and more documentation can be found here:

Posted in .Net, C# | Tagged , | Leave a comment

The Release Manager – whatsgoingon.exe

Versioning software is a large topic. When you’re working in an Agile environment with small, frequent releases, it can be very difficult to know what goes into each release. On a large enterprise project I work on, we maintain several APIs. Consumers interact with these APIs through a Thrift interface, and they primarily want to know: which version of the API has my request? On this project we push builds out to internal customers multiple times a day. Our build server can also create builds off feature branches, so the builds are in various states of done. Keeping track of what changes are in what version is a full-time job. It’s also a job that’s easily automated. Enter…

The Release Manager!!!

The Release Manager monitors an artifacts folder and computes the differences between the class files it is tracking. For my application I’m interested in versioning Thrift interfaces, so it converts the C# changes to the relevant Thrift IDL interface. Say what? Yup. It is a bit of a hack to go from C# to Thrift, but the hack works surprisingly well. So well that I think this could be extended to make a very cool code-first-Thrift-from-C# tool.


The code is definitely a little rough and specific to my problem space. It is data-driven, but I would guess code changes are needed for most different applications. If I find time, I’ll look at extending and generalizing the solution.

There are two inputs to the release manager:
1. A folder with sub folders for each build. Each sub folder has .Net assemblies.
2. A git repository.

There is a hard requirement that the folder name includes the shortened Git hash; alternatively, the shortened hash can be stamped into one of the assemblies.
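As a rough illustration, pulling the shortened hash out of a folder name can be done with a regular expression. This sketch is mine, not the tool’s actual implementation, and the folder naming is hypothetical:

```csharp
using System.Text.RegularExpressions;

public static class ArtifactFolders
{
    // Extracts a 7-12 character hex run from a name like "MyApi-build-a1b2c3d".
    // Caveat: a long purely numeric token in the name would also match.
    public static string ExtractShortGitHash(string folderName)
    {
        Match match = Regex.Match(folderName, @"\b[0-9a-f]{7,12}\b");
        return match.Success ? match.Value : null;
    }
}
```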

Atlassian JIRA integration is optional and requires Atlassian Stash and a specific Git workflow: feature branches must be named after a JIRA ticket and must be merged using pull requests. With this workflow, the Release Manager can associate specific JIRA tickets with each version of compiled software.
There are two additional dependencies:
1. A SQL database for storing differences
2. An IIS site for viewing differences

There are four stages to processing changes. Each stage has a unique command line parameter and can be run independently of the others.

* --artifacts | Scans through the artifacts folder and adds new artifacts to the SQL database
* --setPreviousId | Scans the SQL database and figures out the order of the artifacts
* --processDiff | Processes artifacts in SQL and computes the differences between versions
* --jiraFromGit | Scans through the JIRA ticket numbers and finds the associated Git commits

Architecture – Component Diagram:

The components and their interactions can be seen here. A major driver of the tool was to keep it simple and loosely coupled, while providing definitive information integrated with our Git source control, JIRA ticketing system and build server. whatsgoingon.exe is designed to run as a Windows scheduled task, providing near-real-time information about the changes in each and every build.



All source code can be found on GitHub at: https://github.com/marksl/whatsgoingon

Below are pictures from a large enterprise project. On GitHub there’s also an example that can be generated by running create-db.sql, build.bat and whatsgoingon.exe --artifacts --setPreviousId --processDiff.

versions-running-list versions-diff-between

Posted in .Net, C# | Leave a comment

Installing IIS with Chocolatey

Chocolatey is one of those technologies that I find very exciting. Installing applications is hardly an arduous task, but being able to do it from the command line is damn cool. And it’s really a feature that Windows should have by now.

While installing IIS on a new computer, I discovered that Chocolatey has expanded its feature set to include installing Windows Features.

PS C:\> cWindowsFeatures IIS-WebServerRole
PS C:\> cWindowsFeatures IIS-ISAPIFilter
PS C:\> cWindowsFeatures IIS-ISAPIExtensions
PS C:\> cWindowsFeatures IIS-NetFxExtensibility
PS C:\> cWindowsFeatures IIS-ASPNET

PS C:\> chocolatey install evernote
PS C:\> chocolatey install resharper

The resulting output looks like this:


That’s all there is to it! This is not much faster than doing it from the UI, but it is now simple to automate via PowerShell!

Posted in .Net, IIS | Tagged , , | 1 Comment

SQL Server – Insert a file into a nvarchar column

Note to self: It’s easy to import an entire file into SQL Server.

Given a table that looks like this:


CREATE TABLE [dbo].[TableWithLargeColumn](
	[Id] [int] IDENTITY(1,1) NOT NULL,
	[Data] [nvarchar](max) NULL,
	CONSTRAINT [PK_TableWithLargeColumn] PRIMARY KEY CLUSTERED ([Id] ASC)
)

You can insert a text file into the Data column like this:

insert into TableWithLargeColumn(Data)
select BulkColumn FROM
OPENROWSET (BULK N'C:\Users\public\testdata.txt', SINGLE_CLOB ) as Document
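One caveat worth noting: SINGLE_CLOB reads the file as varchar. Since the Data column is nvarchar(max), a Unicode (UTF-16) text file should be loaded with SINGLE_NCLOB instead:

```sql
-- Use SINGLE_NCLOB for a Unicode (UTF-16) input file
insert into TableWithLargeColumn(Data)
select BulkColumn FROM
OPENROWSET (BULK N'C:\Users\public\testdata.txt', SINGLE_NCLOB) as Document
```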

Posted in SQL | 2 Comments