
Adventures in .NET Software Craftsmanship!

Automating EF 4.3.x Data Migrations in your Build

I was asked to clarify what I meant by “some combinations of parameters not working” so I took it upon myself to rewrite the post to be a bit more specific about the problems that I had when using the migrate.exe tool.

If you are looking for a way to automate the execution of your Entity Framework data migrations in your build process, the data team has provided a command line tool, migrate.exe, that exposes all the commands available in the Package Manager Console in Visual Studio.  The tool conveniently ships in the tools folder of the Entity Framework NuGet package.  It works as advertised, but it does have some quirks; first, though, let me show my two favorite ways to use it.  If you are totally new to EF data migrations, do visit David Hayden’s blog post with a very basic tutorial on the topic, which is simple and to the point, or visit Pluralsight and purchase Julie Lerman’s amazing 1 hour tutorial on the topic (see the details here).

Two ways to use Entity Framework’s Migrate.exe
Option 1 – Reusing the app.config

This is the most straightforward option: you reuse the settings in your application’s app.config to execute the migrations from the command line, so the margin for error is relatively small.  The downside is that if you have different connection string settings for each environment you want to target with the database migration tool, you will need a strategy for modifying the app.config.  For example, if you are working on top of a Web Application project, like an MVC application, you could use web.config transformations during the build process to change the connection settings.

@rem run_db_migrations.cmd
SET CurrentPath=%CD%
SET ConfigFile=%CurrentPath%\Data\App.config
SET MigrateExe=.\packages\EntityFramework.4.3.1\tools\migrate.exe

%MigrateExe% Data.dll /StartUpDirectory:%CurrentPath%\Data\bin\Debug\ /startUpConfigurationFile:"%ConfigFile%"

The following table explains the data being passed to the command line tool.

Option                     Comment
Data.dll                   The assembly with the DbContext and migrations to be executed.
/StartUpDirectory          The directory where your assembly is located.
/startUpConfigurationFile  The path to the configuration file that holds the connection string to be used.
/verbose                   [Optional] Use this option to have the tool output all SQL being executed or generated to the command console.
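The web.config transformation approach mentioned above can be sketched as follows; the transform file name, the connection string name, and the server values are illustrative assumptions, not taken from this project:

```xml
<!-- Web.Staging.config (hypothetical transform; names and servers are made up) -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Replaces the matching connection string when building the Staging configuration -->
    <add name="DataContext"
         connectionString="Data Source=staging-sql;Initial Catalog=MyApp;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>
```

The transformed config can then be handed to migrate.exe through the /startUpConfigurationFile switch.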


Option 2 – No app.Config

This is by far my favorite approach, as you will not need to write complex scripts to modify your app.config in order to target multiple environments.  I don’t believe there is really a downside to this approach.

@rem run_db_migrations.cmd
SET StartUpDirectory=%CD%\Data\bin\Debug\
@rem Example connection string; replace with your target environment's settings.
SET ConnectionString=Data Source=.\SQLEXPRESS;Initial Catalog=MyApp;Integrated Security=True
SET ConnectionStringProvider=System.Data.SqlClient
SET MigrateExe=.\packages\EntityFramework.4.3.1\tools\migrate.exe

%MigrateExe% Data.dll /StartUpDirectory:%StartUpDirectory% /ConnectionString:"%ConnectionString%" /connectionStringProvider:%ConnectionStringProvider%

The following table explains the data being passed to the command line tool.

Option                     Comment
Data.dll                   The assembly with the DbContext and migrations to be executed.
/StartUpDirectory          The directory where your assembly is located.
/ConnectionString          The full connection string for your target environment.
/connectionStringProvider  The ADO.NET provider type to be used when executing against your target environment.
/verbose                   [Optional] Use this option to have the tool output all SQL being executed or generated to the command console.
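Because all the settings live in the script, targeting multiple environments can be as simple as passing the connection string in as an argument. This is a sketch of that idea, not part of the original scripts; the usage line and server name are made up:

```bat
@rem run_db_migrations.cmd (sketch: connection string passed as the first argument)
@rem Usage: run_db_migrations.cmd "Data Source=staging-sql;Initial Catalog=MyApp;Integrated Security=True"
SET StartUpDirectory=%CD%\Data\bin\Debug\
SET ConnectionString=%~1
SET ConnectionStringProvider=System.Data.SqlClient
SET MigrateExe=.\packages\EntityFramework.4.3.1\tools\migrate.exe

%MigrateExe% Data.dll /StartUpDirectory:%StartUpDirectory% /ConnectionString:"%ConnectionString%" /connectionStringProvider:%ConnectionStringProvider%
```

Your build server can then invoke the same script once per environment with a different argument.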



I attempted to learn how to use this tool by running the command line executable and experimenting with the parameters to see which ones were needed, and that approach backfired: in some scenarios I wasn’t getting much feedback from the tool, as you can see from the following screenshot.  Now, I admit I made the mistake of not providing the connectionStringProvider to the command line tool, but I expected it to handle this more gracefully.




Automating your Entity Framework data migrations on your build server is extremely simple and flexible, so feel free to go version and manage your database the efficient way.

CapArea.NET–jQuery Mobile Introduction


As promised, the following is the link to the Bitbucket repository with all the sample code for my presentation on jQuery Mobile on the ASP.NET MVC platform.

I would like to thank everybody who attended the user group, and once again thank the Capital Area .NET User Group leadership for allowing me to share my ideas and thoughts in such a great venue.  Now, go build something and let’s conquer the mobile space!

WebPiCmdLine–Easily provision your Web Servers!

The Web Platform Installer is Microsoft’s solution for distributing frameworks, products, and applications that revolve around its web platform.  If you are a web developer like me, it is probably the first thing you install on your development box or on the servers you manage.  Needless to say, I am a big fan, but I don’t think I realized its true potential until I stumbled onto the command line version of the Web Platform Installer, also known as the WebPiCmdLine.  Now, I get to do a lot of the infrastructure work in the projects that I take part in, which means provisioning servers, installing the continuous integration software (build server), and automating deployments to staging and production environments.  In other words, I design and implement deployment pipelines for Continuous Delivery processes.  Anyway, enough conversation; let’s look at a couple of working examples of when the WebPiCmdLine is useful and convenient.


Example 1 – Ramping up a Basic Developer Box

Let’s say you want to install basic development tools on a fresh Windows 7 Professional install; perhaps you are preparing a demo of a basic web application, or maybe you are trying the latest and greatest technology and don’t want to ruin your current development environment.  One way to set up all the required software on a fresh install is to download the Web Platform Installer, browse through its list of frameworks, products, and applications, and click each product you want to install, and you’re done.  The drawback, though, is that you would have to click around to find all the products and accept each EULA (End User License Agreement); it just takes too much attention and time.  The quick and easy way is to use the WebPiCmdLine to do the work.  My assumption for the following snippet of code is that you have downloaded the WebPiCmdLine and extracted its contents to a local folder on your drive, in my case “C:\Installer\WebPiCmd\”; the following script is located in the “C:\Installer\” folder.


@rem installer.cmd
SET WebPiCmd=.\webpicmd\WebPiCmd.exe
%WebPiCmd% /Install /Products:NETFramework4 /AcceptEula
%WebPiCmd% /Install /Products:SQLExpress /SQLPassword:P@ssw0rd /AcceptEula
%WebPiCmd% /Install /Products:SQLManagementStudio /AcceptEula
%WebPiCmd% /Install /Products:VWD /AcceptEula
%WebPiCmd% /Install /Products:MVC3 /AcceptEula
%WebPiCmd% /Install /Products:MVC4VS2010 /AcceptEula
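If you are unsure which product IDs to pass to /Products, the tool can list them for you. As I recall, the /List switch with a /ListOption value does this; treat the exact option values below as an assumption and check the tool's /? output if they differ in your version:

```bat
@rem list_products.cmd (sketch: discover valid product IDs before scripting installs)
SET WebPiCmd=.\webpicmd\WebPiCmd.exe
@rem /ListOption accepts values such as All, Installed, or Available (to my recollection).
%WebPiCmd% /List /ListOption:All
```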


Example 2 – Web Server for ASP.NET MVC Applications

The setup is the same as above, but in this scenario we are installing on top of a fresh Windows Server 2008 R2 install.  This will be our web server to showcase a basic ASP.NET MVC application.


@rem installer.cmd
SET WebPiCmd=.\webpicmd\WebPiCmd.exe
%WebPiCmd% /Install /Products:NETFramework4 /AcceptEula
%WebPiCmd% /Install /Products:IIS7 /AcceptEula
%WebPiCmd% /Install /Products:MVC3 /AcceptEula
%WebPiCmd% /Install /Products:MVC4VS2010 /AcceptEula



I wrote a blog post, not so long ago, about installing Rails on Ubuntu and how I had scripted a basic install to ease the process of rolling a new Ubuntu image and rebuilding my developer environment.  The script I wrote was fairly complex and tedious, but if a Linux noob like me could write it, anybody can.  At the time, I felt there was no equally easy way to perform quick and clean scripted installations on the Windows platform, but recent research has shown me that times are changing.  The WebPiCmdLine is just one of the relatively new tools (see NuGet, Chocolatey, or PowerShell) that give developers the power to write deployment and automation scripts for the Windows platform.  One comment: I feel the Web Platform Installer is powerful and useful enough that it shouldn’t be associated only with the web; why not re-brand it as the Windows Platform Installer and let it be used for installing anything on the operating system?

Enable NuGet Package Restore – When the basics fail!

Recently, checking third party libraries (“dependencies”) into your source code repository has become a frowned upon practice.  The reason is that it increases the size of your repository, especially if you are using a DVCS, where every commit of updated binaries grows the repository until interacting with it takes valuable time away from developers due to network and disk I/O.


In the .NET space, the tool of choice for centralizing and managing external dependencies is currently NuGet.  NuGet is a Visual Studio extension that allows developers to right-click and add their favorite open source libraries from a centralized repository.  NuGet also provides automated package restore capabilities, which means that if your dependencies have not been downloaded to your computer, they automatically will be when you compile your code base from the Visual Studio IDE.  This feature ensures that even though we are changing the traditional model of managing external dependencies, we are not changing the traditional “F5 build” experience that most developers are familiar with and expect.
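For reference, enabling package restore in this generation of NuGet wires an import of NuGet.targets into each project file. The fragment below is a sketch of what the "Enable NuGet Package Restore" command adds to a .csproj, assuming the standard .nuget folder convention; verify against your own project files:

```xml
<!-- Fragment added to a .csproj by "Enable NuGet Package Restore" (sketch) -->
<PropertyGroup>
  <RestorePackages>true</RestorePackages>
</PropertyGroup>
<Import Project="$(SolutionDir)\.nuget\NuGet.targets"
        Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />
```

It is exactly this kind of imported .targets file that causes the circular dependency described next when the targets themselves live inside a package.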


However, there are some scenarios where this process fails, for instance when a NuGet package modifies the build by adding custom MSBuild targets to your project files (.csproj).  Visual Studio, or MSBuild for that matter, will not be able to load the .csproj to build your project, since the build now depends on targets that package restore is supposed to download prior to compiling.  In other words, there is a circular dependency between the build process and the NuGet packages, because the build itself depends on the NuGet packages to run.  One example of such a NuGet package is PostSharp, a very valuable AOP tool that needs to integrate into your build process to do its work, so I am not complaining, just pointing it out.


A Workaround

One possible workaround is to initialize the process of retrieving dependencies, or, to use a term stolen from my co-worker’s blog, kick start the process before you even open up Visual Studio.  There are many ways this could be accomplished, but I chose to build a batch file (.cmd) and an MSBuild project file that do all the work for you.


<!-- KickStart.msbuild -->
<Project DefaultTargets="KickStart" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <PropertyGroup>
        <NuGetExe Condition="'$(NuGetExe)' == ''">.\.nuget\nuget.exe</NuGetExe>
    </PropertyGroup>
    <Target Name="KickStart">
        <ItemGroup>
            <NuGetPackageConfigs Include="**\packages.config" />
        </ItemGroup>
        <Exec Command="&quot;$(NuGetExe)&quot; install &quot;%(NuGetPackageConfigs.Identity)&quot; -o packages" />
    </Target>
</Project>

The MSBuild file, in a nutshell, just finds the packages.config files in your solution using a relative path search and executes the NuGet command line executable to populate the packages folder.  Assuming you use regular Visual Studio conventions for where your project files (.csproj, .vbproj, etc.) are located, you should be fine by placing both these files right next to your solution (.sln) file.


@rem KickStart.cmd
SET MsBuildPath=C:\Windows\Microsoft.NET\Framework64\v4.0.30319
SET NuGetExe=.nuget\nuget.exe
%MsBuildPath%\MsBuild.exe kickstart.msbuild /t:kickstart /p:NuGetExe=%NuGetExe%


The batch (.cmd) file is just a shortcut for executing the MsBuild project file.



This is just one way to solve this issue, but hopefully the solution I have outlined helps if you are ever faced with the need to kick start your NuGet dependencies outside the Visual Studio IDE.  The good thing about this problem is that it is only relevant when you get a new clone of your repository; at that point you just execute the kickstart.cmd file once, or on your build server, where this process should be happening continuously.

Book Review: NHibernate 3 Beginner’s Guide


NHibernate has been the ORM (Object Relational Mapper) of choice on practically all projects I have been involved in over the last couple of years, and I have nothing but positive things to say about my personal experience with NHibernate.  That being said, I do remember back in 2007, when I was faced with the task of evaluating NHibernate as a replacement for a brittle hand coded DAL (Data Access Layer), that there was scarcely any documentation or books for true beginners.  Obviously, the landscape has changed a bit since; there is a lot of documentation out there in the form of blogs and books that will guide you in the task of using NHibernate to its fullest potential, but it is still very hard to find a good beginner’s guide.

“NHibernate 3 Beginner’s Guide” fills the need for a thorough beginner’s guide to NHibernate.  The material is well presented and organized, and, as with most books published by Packt, it is code and example driven, which in my opinion best fits the way most developers learn these days.

A hidden gem in this book is Chapter 7, which provides a general introduction to monitoring, testing, and profiling your data access strategy.  NHibernate’s design for testability is, in my opinion, the key feature that sets it apart from other ORMs in the .NET space.

I would recommend this book for those learning NHibernate or thinking about choosing NHibernate as their data access technology, as it answers most of the questions you might have before getting up and running and building solid applications.  As a bonus, the book also touches on some more advanced topics, such as which mapping strategy to use and how to develop and implement a testing strategy for your data access code.