Building Azure Applications using Agile Methodologies and Continuous Integration

Introduction

There’s been much chatter on user group forums about the use of agile methodologies within cloud projects. We, at Blush Packages, have successfully completed such a project and wanted to share our architecture with the wider community. This article is based on talks we’re currently giving to various user groups up and down the country, and we hope it gives you pause for thought on how to approach cloud projects. Whilst this article refers to a TeamCity build environment, all the principles can equally be applied to TFS. To further the agenda of developers using the new cloud-hosted TFS Preview, we’ll be speaking on this at the UK Windows Azure Group conference in June.

Put simply, agile development practice begins with a TestFixture.

Once the first test is written using TDD principles, the whole project can begin in earnest.
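
As an illustration only (the BasketService below is hypothetical, not part of the real project), that very first test and the minimal class it drives out might look something like this:

using NUnit.Framework;

// The simplest class the first test drives out.
public class BasketService
{
    public decimal Total()
    {
        return 0m;
    }
}

[TestFixture]
public class BasketServiceTests
{
    [Test]
    public void A_New_Basket_Has_A_Zero_Total()
    {
        var basket = new BasketService();

        Assert.That(basket.Total(), Is.EqualTo(0m));
    }
}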

A modern Microsoft web technology stack begins with ASP.NET MVC Razor for the presentation layer. We defined a production and test stack using the following key tools and libraries. The architecture and choices should come as no surprise because this is a common and powerful application stack.

The test stack we chose:

· NUnit

· NUnit Fluent Extensions

· NSubstitute

And the application stack:

· ASP.NET MVC 3

· Razor

· Services Business Layer

· POCO Domain Layer

· Entity Framework 4.1

· Repository Pattern
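
To make the layering concrete, here is a minimal, illustrative sketch of how these pieces fit together; the entity, repository and service names are hypothetical and not taken from the real project:

using System.Collections.Generic;
using System.Data.Entity;   // Entity Framework 4.1 code-first
using System.Linq;

// POCO domain layer: a plain entity with no persistence concerns.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// EF 4.1 DbContext exposing the domain entities.
public class ShopContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}

// Repository pattern: an abstraction the services (and tests) depend on.
public interface IProductRepository
{
    IEnumerable<Product> GetAll();
}

public class ProductRepository : IProductRepository
{
    private readonly ShopContext _context;

    public ProductRepository(ShopContext context)
    {
        _context = context;
    }

    public IEnumerable<Product> GetAll()
    {
        return _context.Products.ToList();
    }
}

// Services business layer: thin services consumed by the ASP.NET MVC controllers.
public class ProductService
{
    private readonly IProductRepository _repository;

    public ProductService(IProductRepository repository)
    {
        _repository = repository;
    }

    public IEnumerable<Product> GetAll()
    {
        return _repository.GetAll();
    }
}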

Evolutionary Point

Once the stack has been determined, it’s a fairly trivial exercise to begin thinking about how to engineer Windows Azure into the solution using best practice.

To this point you generally have several passing tests and a functioning web application. In order to make room for Azure, the testing approach needs to be modified. This entails refactoring controller-based unit tests to become browser-driven UI tests.

The adjusted Testing stack will now include:

· NUnit

· StoryQ

· WatiN

The Windows Azure SDK comes with a compute and storage emulator which reproduces most of the behaviour you would find in the cloud. It’s essential, therefore, that we also plan to test against this “devfabric”. The Azure emulator should be started in the Test/ClassInitialize methods and torn down at the end of the TestFixture runs. Some example code to do this might look like the following:

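A minimal sketch of that setup and teardown, assuming csrun.exe from the Azure SDK is on the path of the build agent and that the package and configuration locations come from appSettings (the key names here are illustrative rather than the real ones):

using System.Configuration;
using System.Diagnostics;
using NUnit.Framework;

// Runs once before and once after all the test fixtures in this namespace.
[SetUpFixture]
public class AzureEmulatorSetup
{
    [SetUp]
    public void StartEmulator()
    {
        // Start the compute emulator (devfabric) and the storage emulator.
        RunCsRun("/devfabric:start");
        RunCsRun("/devstore:start");

        // Deploy the packaged Azure project and its configuration into the emulator.
        var package = ConfigurationManager.AppSettings["PackagePath"];
        var config = ConfigurationManager.AppSettings["ConfigPath"];
        RunCsRun(string.Format("/run:{0};{1}", package, config));
    }

    [TearDown]
    public void StopEmulator()
    {
        // Shut the devfabric down, then kill DFService so it releases any locked resources.
        RunCsRun("/devfabric:shutdown");
        foreach (var dfService in Process.GetProcessesByName("DFService"))
        {
            dfService.Kill();
        }
    }

    private static void RunCsRun(string arguments)
    {
        using (var csrun = Process.Start("csrun.exe", arguments))
        {
            csrun.WaitForExit();
        }
    }
}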

We also have some appsettings in our test config file:

[Screenshot: appSettings entries in the test project’s configuration file]

The csrun command-line tool is used to start up the devfabric and run a packaged Azure project within the emulator. The corresponding DFService process is killed at the end of the run to free up any locked resources that the emulator would otherwise hold open and which could prevent subsequent test runs from completing.

The architecture we briefly touched on in the application stack was embodied in a set of web and worker roles, as illustrated in the diagram below. As you’ll appreciate, this is a common Service Oriented Architecture (SOA) approach with adapters used by the web and worker role clients. It is in fact a model Microsoft architecture depicted in many best practice guides.

[Diagram: web and worker roles accessing the service layer through adapters]
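
As a rough illustration of the worker role side of that diagram (the adapter and its behaviour are hypothetical), each role delegates to the shared services layer through an adapter rather than talking to storage or the services directly:

using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

// Illustrative adapter over the shared services layer; in the real solution this would
// dequeue work (e.g. from Azure queue storage) and call the appropriate business service.
public class OrderProcessingAdapter
{
    public void ProcessNextItem()
    {
        // Dequeue a message and hand it to the services business layer.
    }
}

public class OrderProcessingWorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        var adapter = new OrderProcessingAdapter();

        while (true)
        {
            adapter.ProcessNextItem();
            Thread.Sleep(1000);
        }
    }
}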

Nightly Build Goals

TeamCity can be utilised fairly easily to do check-in builds. It will check for changes in a source control repository (in our case Subversion) and, on detecting a change, will do the following, in order:

· Remove any existing source local to the CI

· Get the latest source code from subversion

· Build that source

· Run any configured unit tests

· Send an email detailing build success and a summary of the changes made

At this stage none of the above had any interaction with Azure. Any deployments to Azure were done manually, on demand, from Visual Studio, and would be performed prior to exposing the deployment to its intended audience (typically the business).

Although a perfectly good solution at the start of a project, it quickly becomes untenable as your functionality footprint increases.

In order to “continually integrate”, as agile best practice describes, it’s important to ensure that code is pushed to the intended deployment target, which in this case is Windows Azure. As such, this cycle needs to be automated so that “integration tests” can be performed directly against an Azure-deployed host as part of the test lifecycle.

The aim of the nightly build was to extend our check-in build to push all the way to Azure, followed by the requisite testing.

Tooling

The following tools are a common approach to this problem. Many of you will have had experience of several or all of these:

· JetBrains TeamCity

· Cerebrata CmdLets

· Gallio test runner

· PowerShell

· Windows cmd files

Using JetBrains TeamCity

There are a number of reasons to choose TeamCity over other solutions:

1. Familiarity. We have used TeamCity on a number of prior projects which gave us a negligible learning curve to get up and running.

2. Ease of use. A web-based UI for all tasks (configuration, monitoring, etc.)

3. Cost. For a single project with a small number of developers, TeamCity is free.

4. We had to use something. Sounds obvious but we had nothing in our toolset that offered CI as part of its functionality so we had to look externally.

5. Confidence. All of the above plus choosing a tool from a respected supplier minimised the risk that we would encounter issues around the tooling.

Using Cerebrata Cmdlets

For just over $100.00 you get over 100 CmdLets to automate all aspects of your Azure deployment and its ongoing management.

If you were to code the required functionality yourself you would probably spend several days or weeks, end up with a non-generic solution, and it would not be reusable in the way the Cmdlets are. The choice is one of simple economics and time budgeting. Writing the functionality yourself would also shift focus away from the prime goal of the task.

The Cmdlets are multipurpose and cover all aspects of the Service Management and Storage Service APIs, so they become invaluable over the course of the project.

The Build

Please note this is not an instructional step by step guide on using TeamCity or any of the other tools.

In fact the steps we use here should be transferable to any build tool. Reading this will not make you an expert on any of the tools used but will hopefully give you an idea of what you can achieve with them. We also do not discuss installation and configuration of these tools, although we intend to in a more detailed walkthrough post planned for the near future.

It is worth noting that if any of the build steps fail, the build will stop, i.e. you should not end up with a bad deployment because the build carried on after an error.

Build Triggering

When you create a TeamCity build, you need to tell it when to run. In our case we want the build to run at night, preferably at a time when no one will be checking in any source code (we are ninjas… we check in all night!). Our build is triggered to kick off at 2.00am.

[Screenshot: TeamCity build trigger configuration]

You will also note that, in the rare event that nothing has changed, the build will not run.

Source Control Integration

The configuration settings for source control checkout (Subversion) are made once per installation. You can then use this connection (the VCS root in this case) in your build.

It’s good practice in the build to specify where the latest source should be checked out to (the checkout directory) and whether that folder should be cleaned first.

[Screenshot: TeamCity VCS root and checkout settings]

Build

We chose a Visual Studio build to build our solution. This could easily have been an MSBuild step but, in the early stages of our CI life, it helps to have Visual Studio installed on the CI machine to troubleshoot any issues that arise.

The configuration is as shown:

[Screenshot: TeamCity Visual Studio build step configuration]

Backup Source

Not really a step that you would normally see in a build, as there are more obvious ways to do this. The nightly build, however, is a convenient point in the development life cycle where you know you will consistently have up-to-date source in a single place, so it’s worth taking the time to back it up to the CI server.

Run Unit Tests

The unit tests are NUnit tests that test the functionality sitting below our controllers. They test services, any facades over those services (the adapters in the architecture diagram) and any helper classes. We make heavy use of mocking (NSubstitute in our case) to isolate tests to the functionality they should be testing, and the majority, if not all, of these tests come about from a "test first" (TDD) style of development.

It should be noted that developers do not commit source without first checking that the stack of tests is "green".
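
A hedged sketch of what one of these service-level tests might look like; the Order, IOrderRepository and OrderService types below are illustrative stand-ins, not the real project types:

using NSubstitute;
using NUnit.Framework;

// Minimal stand-in types for the service layer under test.
public class Order
{
    public int Id { get; set; }
}

public interface IOrderRepository
{
    Order GetById(int id);
}

public class OrderService
{
    private readonly IOrderRepository _repository;

    public OrderService(IOrderRepository repository)
    {
        _repository = repository;
    }

    public Order GetOrder(int id)
    {
        return _repository.GetById(id);
    }
}

[TestFixture]
public class OrderServiceTests
{
    [Test]
    public void GetOrder_Delegates_To_The_Repository()
    {
        // Substitute the repository so the test exercises only the service.
        var repository = Substitute.For<IOrderRepository>();
        repository.GetById(42).Returns(new Order { Id = 42 });
        var service = new OrderService(repository);

        var order = service.GetOrder(42);

        Assert.That(order.Id, Is.EqualTo(42));
        repository.Received(1).GetById(42);
    }
}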

The configuration here is to tell TeamCity:

· Which test runner to use,

· What version of .NET is in play

· A list of assemblies containing test fixtures (classes marked with the [TestFixture] attribute)

[Screenshot: TeamCity NUnit test step configuration]

Package For Azure

If we get a clean compile and the unit tests all run green, the build can be deployed to the Azure staging environment. Staging looks just like the live environment and allows us to deploy and test without touching the current live setup.

By default, a Visual Studio build does not package your application for deployment to Azure. This is part of the deployment step that you would manually trigger when you deploy from Visual Studio.

Therefore we have an additional build step, using MSBuild, to do this. The build file path points at your Azure project file; there is no requirement to create a separate MSBuild file, as the .ccproj is itself an MSBuild file.

The key to creating the publish files is the CorePublish target, which produces the .cspkg package and .cscfg service configuration used in the following steps.

[Screenshot: TeamCity MSBuild step invoking the CorePublish target]

Publish to Azure

This is where we make use of the Cerebrata Cmdlets to take the assets created in the previous step and publish them to the Azure staging environment.

From a build step perspective, this is just a call to a PowerShell script. The -ExecutionPolicy Unrestricted command-line option instructs PowerShell to run the script regardless of any default execution policy restrictions.

[Screenshot: TeamCity PowerShell build step for the staging deployment]

Deployment Script

# Subscription Id
$subscriptionId = "OurId";
# Following example illustrates how to create a certificate object for a certificate
# present in certificate store.
$certificate = Get-ChildItem -path cert:\CurrentUser\My\OurCertId
# Name of the hosted service
$serviceName = "OurServiceName";
# Slot (Production or Staging)
$slot = "Staging";
# Package file (.cspkg) location. It could be a file on the local computer or a file stored in blob storage.
$packageLocation = "C:\PathtoPackage\OurPackage.ccproj.cspkg";
# Configuration file (.cscfg) location. It is a file on the local computer.
$configFileLocation = "C:\PathtoPackage\bin\Debug\app.publish\OurPackage.Cloud.cscfg";

# Label for deployment
$label = "Nightly Build";
# Upgrade mode. It could either be "Auto" or "Manual"
$mode = "Auto";
echo "Running Azure Deployment to staging"
Update-Deployment -ServiceName $serviceName -Slot $slot -PackageLocation $packageLocation -ConfigFileLocation $configFileLocation -Label $label -SubscriptionId $subscriptionId -Certificate $certificate
echo "Azure deployment to staging complete"

Get URL of Staging Instance

The URL for a staging instance of Azure is not guaranteed to be consistent between deployments. Cerebrata provides a Cmdlet to get the deployment metadata, and the returned payload contains the staging instance URL for the web role.

Here is the TeamCity configuration for the build step:

[Screenshot: TeamCity build step calling the Get-Deployment script]

The PowerShell script that calls the Cmdlet is again largely unmodified from the Cerebrata sample script, apart from pushing the output to a file for further parsing.

Get-Deployment Script

# Subscription Id
$subscriptionId = "OurId";
# Following example illustrates how to create a certificate object for a certificate
# present in certificate store
$certificate = Get-ChildItem -path cert:\CurrentUser\My\OurCertId
# Name of the hosted service
$serviceName = "OurServiceName";
# Slot (Production or Staging)
$slot = "Staging";
echo "Running GetDeployment"
echo $subscriptionId
echo $slot
echo $serviceName
echo $certificate
Get-Deployment -ServiceName $serviceName -Slot $slot -SubscriptionId $subscriptionId -Certificate $certificate | Out-File StagingDeployment.txt

Transform UI Specification configuration

This is probably a common problem for most CI unit test assemblies, where the configuration might have to change from environment to environment. If this were a web.config file you could utilise Visual Studio’s web.config transformations out of the box, but not in this case.

Fortunately for us this problem has already been solved by …, and we use his config transformation utilities to perform the environment changes we need for our config file.

The step configuration looks like:

[Screenshot: TeamCity command-line build step running the config transformation]

The .cmd file looks like:

@echo off
set ProjectPath=C:\OurRootPath
set Source=%ProjectPath%\OurSpecificationsPath\app.config
set Transform=%ProjectPath%\OurSpecificationsPath\Transform.config
set Target=%ProjectPath%\OurSpecificationsPath\bin\debug\OurSpecifications.UI.Specifications.dll.config
C:\OurRootPath\utils\ctt.exe s:%Source% t:%Transform% d:%Target%

Run integration tests

All the previous steps lead to this. The integration tests are a set of WatiN tests (WatiN is a library that lets you drive a browser programmatically from your tests and assert against the rendered pages, e.g. that a page contains certain text). These are run against the newly deployed Azure instance. We could have used the NUnit test runner from our unit tests step, but instead we used Gallio for its enhanced test output.
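
For illustration, a single WatiN test against the staging deployment might look something like the sketch below; the appSettings key and the page text being asserted are hypothetical:

using System.Configuration;
using NUnit.Framework;
using WatiN.Core;

[TestFixture]
[RequiresSTA] // WatiN drives Internet Explorer and needs a single-threaded apartment.
public class HomePageTests
{
    [Test]
    public void Home_Page_Shows_The_Product_Catalogue()
    {
        // The staging URL is written into the transformed config file by the previous build step.
        var stagingUrl = ConfigurationManager.AppSettings["StagingUrl"];

        using (var browser = new IE(stagingUrl))
        {
            Assert.That(browser.ContainsText("Product Catalogue"), Is.True);
        }
    }
}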

The build step looks like:

[Screenshot: TeamCity build step running the Gallio test runner]

The command writes the output as HTML to a known directory on the CI machine. This content is web-enabled, and a link to it is included in the final email sent on build completion.

The output looks something like this:

[Screenshot: Gallio HTML test report]

VIP Swap

Assuming everything is green with our tests and our build is good enough for a "live" deployment, we run a final Cerebrata Cmdlet to perform a Virtual IP Swap (VIP Swap) from staging to production.

The build step looks like:

[Screenshot: TeamCity build step performing the VIP swap]

The Cerebrata based PowerShell script looks like:

# Subscription Id
$subscriptionId = "OurId";
# Following example illustrates how to create a certificate object for a certificate
# present in certificate store.
$certificate = Get-ChildItem -path cert:\CurrentUser\My\ourCertId
# Name of the hosted service
$serviceName = "OurServiceName";
# Script below moves a deployment in staging slot to production slot if production slot is empty
# otherwise it swaps staging and production slot.
Move-Deployment -ServiceName $serviceName -SubscriptionId $subscriptionId -Certificate $certificate

Conclusion

The above set of steps and explanations should give you a broad overview of what is required to publish your applications to Azure in an automated way and to run your integration tests against them. We are available to answer any questions on this topic. A fuller description of the above will be presented by us at the UK Windows Azure User Group meeting in Manchester on the 4th April. Registration for this can be done at http://www.ukwaug.net

 


Authors

John Mitchell

John Mitchell has worked in technical roles for a number of brand names including Tesco.com, Vodafone, Volkswagen Finance and the Royal Bank of Scotland. Recently John was responsible for delivering the ticketing system for the Abu Dhabi Grand Prix. John specialises in Agile development on the Microsoft stack and is currently engaged with Integrity Software as a consultant on their Azure SaaS solution.

Rebecca Martin

Becca Martin is currently the development lead at Integrity Software. She has worked on a number of Azure projects but is mainly known within the Agile space. Becca is an active speaker at user groups and events in both the UK and USA. She has worked on software for end customers such as Toyota, Coca-Cola, Elateral, Enta Ticketing, Thoughtworks, Dot Net Solutions and The Carbon Trust. Becca recently set up the Windows Azure inside Solutions group.


