Build Automation Part 1: Overview and Pre-build Tasks

Series Index

Build Automation Part 1: Overview and Pre-build Tasks
Build Automation Part 2: Building and Packaging
Build Automation Part 3: App Deployment Script
Build Automation Part 4: Database and Report Deployments

A recent major application release at work required various changes to our build and deployment process. Our continuous integration process was created probably 5+ years ago when Cruise Control .NET was king. It has served us well but I did not want to invest in it anymore, especially with alternatives like Jenkins and Team City out there. CI aside, I wanted to reevaluate the build and deployment process as a whole and overhaul as much of it as possible to help ease the burden.

In this post I will discuss an overview of this initiative and address build script details in terms of pre-build activities. If time, interest, and priorities permit, I will create follow-up posts on topics such as build and packaging details, deployment scripts, and the setup and configuration of Team City.

Goals of the Build and Deployment Overhaul

  • Build script - Producing a more complete build script that could be run both outside and inside a CI process to build and package a deployment.
  • CI system change - Replacing Cruise Control with Team City.
  • Further automation - Automating some features that were previously manual such as ClickOnce certificate installation.
  • Deployment packaging - Compressing the app files to be deployed into a self-extracting archive that runs a deployment script after extraction.
  • Deployment script rewrite - Replacing an old deployment batch file with a more capable PowerShell script.
  • Remote deployment - The original process required performing the deployment locally on the target server but the desire was to be able to deploy from any given location to any given location remotely.
  • Existing deployment backup - Compressing the previous installation before overwriting it in the event of issues.
  • Dead-simple production deployments - While our CI process automated deployments to some environments, we have not been in a position to do this for production due to auditing, policy, security, and other reasons. The existing deployment script required passing in some parameters and some knowledge of how things were set up. With handing over prod deployments to new parties, there was a need to simplify things as much as possible to reduce the knowledge requirement and the chance of human error.
  • Artifact deployment enhancements - Additional automation around deploying database and report changes.

What to Do in a Build Script vs. a CI Tool?

The ultimate goal here is automating the building, packaging and deployment of an application, and CI is the preferred way to accomplish that. However I am of the opinion that this automation shouldn't be hopelessly dependent on CI to work. There may be times when a build and deployment needs to be done from a developer's machine. Additionally, developing, testing, and debugging an automated build is more easily done outside of a CI process.

On one end of the extreme, the build steps and tools of a CI system like Team City could be used with no build script at all. Separate steps can be configured in a web interface of the CI tool to retrieve source code, compile a solution or project, install packages via NuGet, run unit tests and much more. In this scenario the CI build configuration effectively is the build script.

On the other end of the extreme, every single action could be done in a build script and the CI system could just call that build script.

The CI-centric approach has the advantages of a simple wizard-like setup, leveraging the power of a CI system, and avoiding doing a lot of potentially tedious work in what might be a large build script otherwise. A complete build script has the advantages of precise control and independence from a CI tool but comes at the cost of more work and maintenance and likely means the CI tool is being under-utilized.

As with most things I think nirvana lies somewhere in the middle. Here is what I settled on:
  • Source Control - Done from a CI tool. When building locally I usually already have the latest source code. From a CI build, tools like Team City are pretty smart about efficient VCS operations in terms of caching, settings reuse, dependent builds, VCS build triggers, labeling, supported platforms etc.
  • Building - pre-build activities (NuGet etc.), code compilation, and packaging the deployable files are done from the build script. These are the core pieces that may need to be done both inside and outside of a CI tool.
  • Unit tests - Done from a CI tool. I may not always want to execute unit tests when building the app locally but from a CI build I always want some level of tests run. Decent CI tools support most popular unit testing frameworks really well.
  • Code analysis and other - Most other tasks such as code analysis or build notifications are left to a CI system to handle as they are well-suited for these tasks.

Choosing Tools

Build Tool

At the risk of turning you away, I ended up going with MSBuild: I was somewhat familiar with it, there was some MSBuild content invoked from Cruise Control that was salvageable, and it has a solid base of support. I think MSBuild is a decent tool capable of far more than the simple compilation tasks many are familiar with from project and solution files. Mostly I am just not fond of XML-based build tools; I would much rather use a scripting language.

Had I rediscovered PSake sooner I probably would have chosen it instead. This Ayende post on Psake is a bit older, but it compares Psake to some other options such as NAnt, Rake, and Bake, and I think many of the points are still valid.

Some of the MSBuild pain would be offset by many of the great MSBuild task collections out there such as the MSBuild Extension Pack. Additionally, PowerShell script can be embedded directly inline in an MSBuild script (through a PowerShell task) or MSBuild can execute an external PowerShell script file to offset some of the XML heaviness.


Deployment Tool

For the deployment I chose PowerShell to extract the build archive that MSBuild creates, back up the existing installation, stop services, disconnect server sessions, copy files, install and start services, and more. The idea was to run this deployment script directly from a self-extracting executable without someone having to launch PowerShell, dot source the file, invoke functions, etc.

Initial Prerequisites

For this app's build (.NET WinForms ClickOnce smart client) the following needs to be installed:
  • Windows SDK or Visual Studio depending on whether the build is on a dev box or CI server
  • .NET FX 4.0 for MSBuild
    • MSBuild Extension Pack - used for several custom tasks (x86 v4.0.4.0 installer)
  • TFS Explorer - on the build server for TFS source control access
  • Software component dependencies of the app being built. For this app this included items such as:
    • Oracle Data Access Components (ODAC)
    • Infragistics NetAdvantage Controls
    • Office Outlook PIAs
    • Exchange Web Services Managed API (EWS)

Where to Store Build Script Files

I went with the convention of creating a build folder under each application branch; in our case with TFS source control usually that is at least Dev, Main, and Production. Often the build scripts and configuration will be nearly identical between branches but there may be some configuration or other differences. As changes to the build scripts are made in the Dev branch and verified they will later be merged to Main and Production over the iteration lifecycle.

The app's code can be referenced easily in relative fashion by going up one level. Any temporary output of the build script will be created in an artifacts subfolder of the build folder.

All the contents of the build folder are stored in source control, minus the output in the artifacts subfolder. During a CI build, all of the build script items will be pulled from source control along with the rest of the code and resources for the application.


The first task will be cleaning up any output from previous builds and ensuring the build is started with a clean slate. First some properties and item groups are defined at the top of the build script to reference various files and folders that will be cleaned up.
<PropertyGroup>
	<Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
</PropertyGroup>

<ItemGroup>
	<BuildRoot Include="..\build\"/>
	<BuildArtifactsItem Include="..\build\artifacts\"/>
	<SolutionFile Include="..\MyApp.sln"/>
	<NuGetPackagesFolderItem Include="$(NuGetPackagesFolder)"/>
	<LicenseFiles Include="..\**\licenses.licx" Exclude="..\build\licenses.licx"/>
</ItemGroup>

<ItemGroup Label="MiscOtherBuild">
	<!-- Removed next 2 from main app solution as rarely used/changed or has dependency issues -->
	<MiscOtherToBuild Include="../MyApp.ServiceController/MyApp.ServiceController.csproj"/>
	<MiscOtherToBuild Include="../ThirdParty.Library/ThirdParty.Library.vbproj"/>
	<ServiceAppBuild Include="..\..\..\..\Common\SomeServiceApp\$(Branch)\Code\build\"/>
</ItemGroup>
The Clean target invokes the clean target in the main app solution, related project files outside of the main solution, and a dependent service application to clean up compiled output. It also removes the build artifacts folder that gets created with the results of the build process. Finally it deletes licenses.licx files and the NuGet packages folder to ensure a fresh pull.
<Target Name="Clean">
	<Message Text="Cleaning up"/>

	<!-- do a clean on solution first since it has the most output, leave custom tasks at end -->
	<MSBuild Projects="@(SolutionFile)" Targets="Clean"/>

	<Message Text="Removing existing content in @(BuildArtifactsItem)"/>
	<!-- Subdirectories could exist which can lead to error "directory is not empty" 
	removing top level dir normally -->
	<MSBuild.ExtensionPack.FileSystem.Folder TaskAction="RemoveContent" 
		Path="@(BuildArtifactsItem)" 
		Condition="Exists(%(BuildArtifactsItem.FullPath))" />
	<Message Text="Removing @(BuildArtifactsItem) folder"/>
	<RemoveDir Directories="@(BuildArtifactsItem)"/>

	<Message Text="Removing NuGet packages folder $(NuGetPackagesFolder)"/>
	<MSBuild.ExtensionPack.FileSystem.Folder TaskAction="RemoveContent" 
		Path="@(NuGetPackagesFolderItem)" Force="true" 
		Condition="Exists(%(NuGetPackagesFolderItem.FullPath))" RetryCount="5" />
	<RemoveDir Directories="@(NuGetPackagesFolderItem)" />

	<!-- File could be readonly; we could add condition of not readonly. Continue on error -->
	<Message Text="Removing any existing licenses.licx files"/>
	<Delete Files="@(LicenseFiles)" ContinueOnError="true"/>

	<MSBuild Projects="@(MiscOtherToBuild)" Targets="Clean"/>
	<MSBuild Projects="@(ServiceAppBuild)" Targets="Clean"/>

	<Message Text="Clean complete"/>
</Target>
The Path to Compilation

A Compile target is started with a DependsOnTargets value of a $(CompileDependsOn) property to be defined that will list all the targets that must be executed successfully before compilation:
<Target Name="Compile" DependsOnTargets="$(CompileDependsOn)">
	<!-- ... -->
</Target>
This property is defined as follows at the top of the build script.
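Based on the pre-build targets covered in this post, a sketch of what such a property can look like; the target names are taken from the sections that follow, and the exact list and order in the real script may differ:

```xml
<PropertyGroup>
  <!-- targets assumed from the sections in this post; order matters -->
  <CompileDependsOn>
    Init;
    CreateLicenseFiles;
    NuGetPackageRestore;
    InstallClickOnceCert;
    UpdateAssemblyInfo
  </CompileDependsOn>
</PropertyGroup>
```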


<Target Name="Init" DependsOnTargets="Clean">
	<Message Text="Making build artifacts dir of @(BuildArtifactsItem)"/>
	<MakeDir Directories="@(BuildArtifactsItem)"/>

	<Message Text="Build number is $(BUILD_NUMBER)"/>
	<CreateBuildInfoFile/>
</Target>
The Init target first creates a build artifacts folder where output of the build will be placed.

It then invokes a custom, inline PowerShell task to create a "build information file" that will store metadata such as the app version, file version, built on time, ClickOnce Publish Version etc. This file was created because originally a target was being called from multiple locations to get the file version and the file version was based in part on the date/time which could change during the course of the build. Later it also was used to store and increment the ClickOnce revision number.

Later the build script will read in this file into an object, update various properties and write it back out.
<UsingTask TaskFactory="PowershellTaskFactory" TaskName="CreateBuildInfoFile" 
	AssemblyFile="$(PowerShellAssembly)">
  <Task>
    <![CDATA[
  $log.LogMessage([Microsoft.Build.Framework.MessageImportance]"High", "Creating build info file for later use")
  $obj = new-object PSObject
  $obj | add-member -membertype NoteProperty -name "AppVersion" -value "$(BUILD_NUMBER)"
  $obj | add-member -membertype NoteProperty -name "FileVersion" -value ([System.DateTime]::Now.ToString("yyyy.MM.dd.HHmm"))
  $obj | add-member -membertype NoteProperty -name "BuiltOn" -value ([System.DateTime]::Now.ToString("G"))
  $obj | add-member -membertype NoteProperty -name "ClickOnceRevision" -value 0
  $obj | add-member -membertype NoteProperty -name "ClickOncePublishVersion" -value ""
  $obj | export-csv $(BuildArtifacts)BuildInfo.csv -notypeinformation
    ]]>
  </Task>
</UsingTask>
This task is part of the MSBuild Extension Pack and the $(PowerShellAssembly) value is defined as:
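Assuming the Extension Pack's default install location, the property points at the assembly containing the PowerShellTaskFactory; something along these lines (the path is an assumption and should be adjusted to wherever the pack is installed):

```xml
<PropertyGroup>
  <!-- path is an assumption; point at the installed MSBuild Extension Pack location -->
  <PowerShellAssembly>C:\Program Files (x86)\MSBuild\ExtensionPack\4.0\MSBuild.ExtensionPack.TaskFactory.PowerShell.dll</PowerShellAssembly>
</PropertyGroup>
```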

The Build Number and Calling the Build Script

In the Init target you may have noticed $(BUILD_NUMBER) referenced in a couple of places. During a CI build, Team City sets this property, which can then be referenced in the build script. When building outside of a CI process, the value needs to be set when msbuild is invoked.

I prefer to invoke MSBuild build scripts from PowerShell. In my PowerShell $profile I add the .net framework path to the PATH environment variable so I do not have to specify the location of MSBuild each time.
$env:Path += ";C:\Windows\Microsoft.NET\Framework\v4.0.30319"
Usually I will invoke an "Open PowerShell here" action from the build script's folder in Windows Explorer and then execute the build script with something like:
msbuild /p:BUILD_NUMBER=
Depending on what steps need to be performed or tested the invocation will vary a bit:
msbuild /t:Compile /p:BUILD_NUMBER= /p:Configuration=Release
msbuild /t:Clean

Creating License Files

Component vendors such as Infragistics require licenses.licx files for design-time use of controls. We do not store these license files in source control for a variety of reasons, but the build will generate exceptions if these files do not exist. This task copies a zero-length license file to the property directories to prevent exceptions during compilation.

This target first gets a reference to all AssemblyInfo files that reside in the same directories where the license files need to be. The actual directory name varies between Properties and My Project depending upon the language. The directory of each file is used as the destination to copy the empty license file to.

Initially I tried creating an empty license file using the File.WriteLines method of the MSBuild extension pack. That did not work though they since addressed it but I never tried the changes.
<Target Name="CreateLicenseFiles">
	<CreateItem Include="..\**\AssemblyInfo.*">
	  <Output TaskParameter="Include" ItemName="PropertyDirList" />
	</CreateItem>

	<Message Text="Checking/creating licenses.licx files"/>
	<Message Text="%(PropertyDirList.RootDir)%(PropertyDirList.Directory)" />
	<!-- copy the empty licenses.licx kept alongside the build script into each property directory -->
	<Copy SourceFiles="licenses.licx" 
		DestinationFolder="%(PropertyDirList.RootDir)%(PropertyDirList.Directory)" />
</Target>


NuGet Package Restore

Initially I thought that with NuGet Package Restore enabled in the app's Visual Studio solution, any missing NuGet packages would be automatically downloaded and installed when the code was compiled by MSBuild. As it turned out it was not all rainbows and unicorns; that only worked in the context of Visual Studio. It was simple enough however to enumerate all packages.config files and run the NuGet command line to install the packages in each.
<PropertyGroup>
	<NuGetExe Condition="$(NuGetExe) == ''">..\.nuget\nuget.exe</NuGetExe>
</PropertyGroup>

<Target Name="NuGetPackageRestore">
  <ItemGroup>
    <NuGetPackageConfigs Include="..\**\packages.config" />
  </ItemGroup>

  <Message Text="Restoring NuGet package(s). NuGet path is $(NuGetExe)" />
  <Exec Command="$(NuGetExe) install %(NuGetPackageConfigs.Identity) -o $(NuGetPackagesFolder)" />
</Target>

ClickOnce Certificate

The required certificate for a ClickOnce-distributed Windows app has been a pain in the past. For one, we tend to forget what the password is after a while and end up trying several common variations. For another, the cert expires after a year by default so we end up generating a new one. Then on the build server the build would break and someone had to log on as the build account, manually import the new certificate, enter the password, fire off MSBuild, and verify the fix. I wanted to get away from all that as much as possible by having the build script install the certificate.

To install the certificate, the password used to create it will be needed, so a property is defined for that. It may not be required, but for good measure the script first removes the certificate if it exists before installing it. On a build server, removing the certificate again once the app has finished compiling might also be desirable.

To remove any existing certificate it needs to be uniquely identified, which is done through its thumbprint. On a machine where the cert is already installed, it can be found by browsing IE => Tools => Internet Options => Content => Certificates => Personal. Issued to/by and Issue Date (Not Before) are the key fields to look at.

Another way to search for the cert is with PowerShell. For example if you know the thumbprint starts with E595 you could run:
cd cert:
dir -recurse | where {$_.Thumbprint -like '*e595*'} | Format-List -property *
Once the right cert is identified the thumbprint can be copied into a property in the build script. I do not recommend copying from IE however as you may pick up special unicode characters that are hard to see and will prevent a match in finding the cert. Also all spaces would need to be removed and letters would need to be uppercased. It is easier to copy the thumbprint data from PowerShell where the format is already correct.
<PropertyGroup Label="ClickOnceProps">
	<ClickOnceCertThumb><!-- cert thumbprint here --></ClickOnceCertThumb>
	<ClickOnceCertPass><!-- cert password here --></ClickOnceCertPass>
	<ClickOnceCertFile><!-- path to .pfx file here --></ClickOnceCertFile>
</PropertyGroup>
The Certificate task in the MSBuild Extension Pack is used to remove a certificate via supplying the thumbprint:
<Target Name="RemoveClickOnceCert">
	<!-- Continue on error in case it doesn't exist, will just be a warning -->
	<Message Text="Removing any existing ClickOnce certificate"/>
	<MSBuild.ExtensionPack.Security.Certificate TaskAction="Remove" 
		Thumbprint="$(ClickOnceCertThumb)" ContinueOnError="true"/>
</Target>
Likewise the Certificate task is used to install the certificate via supplying the filename and password. The thumbprint and subject properties are output for verification purposes.
<Target Name="InstallClickOnceCert" DependsOnTargets="RemoveClickOnceCert">
	<Message Text="Installing Certificate file $(ClickOnceCertFile)"/>
	<MSBuild.ExtensionPack.Security.Certificate TaskAction="Add" 
		FileName="$(ClickOnceCertFile)" CertPassword="$(ClickOnceCertPass)">
	  <Output TaskParameter="Thumbprint" PropertyName="TPrint"/>
	  <Output TaskParameter="SubjectDName" PropertyName="SName"/>
	</MSBuild.ExtensionPack.Security.Certificate>
	<Message Text="Installed cert with thumbprint $(TPrint) and subject $(SName)"/>
</Target>
Instead of using the Certificate task in the MSBuild extension pack, using PowerShell directly to install the certificate is another option. For example, see this script on poshcode or this post by James Kehr.

Updating AssemblyInfo

Prior to starting the compilation various info in the AssemblyInfo files should be set including:
  • Assembly Version
  • File Version
  • Copyright
  • Company Name
There are a few different ways to do this including TeamCity's AssemblyInfo Patcher, the AssemblyInfo task of the MSBuild Community Tasks, MSBuild Extension Pack's AssemblyInfo class, regular expressions and replacements with PowerShell or MSBuild tasks, etc. For me the patcher was out since it was CI-only and the other tasks had usage requirements I did not like. I chose the regular expression route though RegEx solutions do tend to be brittle.

First the UpdateAssemblyInfo target requires that BUILD_NUMBER is set; otherwise it is skipped. Next a couple of RegEx patterns are defined. There are two for AssemblyFileVersion as I found some formatted as AssemblyFileVersion( "..." ) and the rest as AssemblyFileVersion("...") due to differing developer formatting settings (again, this approach is more brittle). It probably would have been trivial to combine the two regular expressions into one but my RegEx skills are rusty and I was feeling lazy at the time.
<Target Name="UpdateAssemblyInfo" Condition="'$(BUILD_NUMBER)' != ''">
    <ItemGroup>
      <AssemblyInfoFiles Include="../**/Properties/**/AssemblyInfo.cs;" />
      <AssemblyInfoFiles Include="../**/My Project/**/AssemblyInfo.vb;" />
      <ClientAssemblyInfo Include="../MyApp.Client/My Project/AssemblyInfo.vb"/>
    </ItemGroup>

    <PropertyGroup>
      <AssemblyFileVersionPattern>AssemblyFileVersion\("\d+\.\d+\.\d+\.\d+"\)</AssemblyFileVersionPattern>
      <!-- 2nd ver handles spaces for those that have formatting like AssemblyFileVersion(  ". Should combine regexs -->
      <AssemblyFileVersionPattern2>AssemblyFileVersion\( +"\d+\.\d+\.\d+\.\d+" \)</AssemblyFileVersionPattern2>
    </PropertyGroup>

	<!-- ... -->
Next a target is called to get the file version:
	<GetFileVersion>
	  <Output TaskParameter="ReturnValue" PropertyName="FileVersion"/>
	</GetFileVersion>
Which is retrieved elsewhere using the BuildInfo file written to earlier:
<UsingTask TaskFactory="PowershellTaskFactory" TaskName="GetFileVersion" AssemblyFile="$(PowerShellAssembly)">
    <ParameterGroup>
      <ReturnValue Output="true"/>
    </ParameterGroup>
    <Task>
      <![CDATA[
            $obj = import-csv $(BuildArtifacts)BuildInfo.csv
            $returnValue = $obj.FileVersion
      ]]>
    </Task>
</UsingTask>
Next the current year is retrieved into a property which will be used momentarily to update the Assembly copyright so it has the current year.
<MSBuild.ExtensionPack.Framework.DateAndTime TaskAction="Get" Format="yyyy">
      <Output TaskParameter="Result" PropertyName="Year"/>
</MSBuild.ExtensionPack.Framework.DateAndTime>
Next the same file version is stamped on all assemblies and the assembly version is only set for the entry client assembly. For a refresher on differences between assembly versions vs. file versions see a post such as this.
<!-- same file version for all assemblies; patterns/replacements reconstructed from the description above -->
<MSBuild.ExtensionPack.FileSystem.File TaskAction="Replace" 
	RegexPattern="$(AssemblyFileVersionPattern)" 
	Replacement='AssemblyFileVersion("$(FileVersion)")' 
	Files="@(AssemblyInfoFiles)" />
<MSBuild.ExtensionPack.FileSystem.File TaskAction="Replace" 
	RegexPattern="$(AssemblyFileVersionPattern2)" 
	Replacement='AssemblyFileVersion("$(FileVersion)")' 
	Files="@(AssemblyInfoFiles)" />

<!-- special handling for client app project: assembly version only set here -->
<MSBuild.ExtensionPack.FileSystem.File TaskAction="Replace" 
	RegexPattern='AssemblyVersion\("\d+\.\d+\.\d+\.\d+"\)' 
	Replacement='AssemblyVersion("$(BUILD_NUMBER)")' 
	Files="@(ClientAssemblyInfo)" />
<MSBuild.ExtensionPack.FileSystem.File TaskAction="Replace" 
	RegexPattern='AssemblyVersion\( +"\d+\.\d+\.\d+\.\d+" \)' 
	Replacement='AssemblyVersion("$(BUILD_NUMBER)")' 
	Files="@(ClientAssemblyInfo)" />
Finally "while we're in there" company name and copyright are set:
<MSBuild.ExtensionPack.FileSystem.File TaskAction="Replace" 
	RegexPattern='AssemblyCopyright\(".*"\)' 
	Replacement='AssemblyCopyright("Copyright © Initech $(Year)")' 
	Files="@(AssemblyInfoFiles)" />
<MSBuild.ExtensionPack.FileSystem.File TaskAction="Replace" 
	RegexPattern='AssemblyCompany\(".*"\)' 
	Replacement='AssemblyCompany("Initech")' 
	Files="@(AssemblyInfoFiles)" />
</Target>

What's next?

That's it for the pre-build activities. I may follow this up with posts on build and packaging, deployment scripts, and Team City setup and configuration.

TFS Artifact Manager

The What

TFS Artifact Manager is a tool intended for downloading application "artifacts", such as database scripts and report files, from TFS. It pulls down these files from task source control changesets and attachments, organizes them, outputs task and change information, and helps combine and package database object changes. The tool's primary value is doing this in mass for a work item such as a Project work item, where it will enumerate all linked scenarios, issues, bugs, and tasks, providing the complete set of source code changes that need to be deployed for a given software release.

The Why

The .NET application code side of our release management process is fairly automated and painless, with the exception of merging certain XML-based files in TFS, such as project files. The same has not been true with other project items such as reports and especially database changes.

Our continuous integration process does automatically deploy report changes on checkins in certain environments, and it pulls down and combines source controlled database changes. However that only solves part of the problem and managing these artifacts has historically been painful to say the least.

The History

This tool was created by the great Bryant Brabson to help pull together these changed artifacts for a given software release. Upon his sad departure I became the next victim of managing the release engineering for this large operational app. I took what was started and made a number of large additions and changes where I could find the time between deployments. While this utility is somewhat technology and process specific, it is not company-specific per se so I decided to release the source. However some changes would likely be needed to adapt the tool to different processes.

The Setup

Some details on the setup for the app I have been using this tool with are below. The tool does not really care what database or reporting system is in use; it is simply pulling changeset files and attachments where the file extensions match what is configured for the associated work items.
  • Source Control, Work Item Tracking: TFS 2008 with MSF Agile process template
    • Not tested against TFS Server 2010 yet but probably works
    • Older, customized version of MSF Agile process template
    • Some customizations likely needed depending on process template details (related work items etc.)
  • Database: Oracle
    • Database file extensions are configurable
    • Database Packager view allows selection of Oracle/SQL Server for previewing SQL (syntax highlighting)
  • Database Source Control: Toad for Oracle using TFS MSSCCI provider
    • Hopefully native use of Toad's TFS provider soon
  • Reporting: Sql Server Reporting Services
    • Report file extensions are configurable

Configuring the Tool

Some one-time configuration is available in File → Settings, namely the TFS server value.

In the app directory there are files ReportFileTypes.xml and DatabaseFileTypes.xml that specify the file extensions that help categorize and organize the artifact changes. These are also content files in the main source code project and get copied into the bin folder when changed.
<?xml version="1.0" encoding="utf-8"?>
<FileTypeInfo TypeName="Database" FileType="Database">
    <FileExtension Category="Packages" Extension="PKB" Name="PACKAGE BODY" />
    <FileExtension Category="Packages" Extension="PKS" Name="PACKAGE" />
    <FileExtension Category="Functions" Extension="FNC" Name="FUNCTION" />
    <FileExtension Category="Procedures" Extension="PRC" Name="PROCEDURE" />
    <FileExtension Category="Triggers" Extension="TRG" Name="TRIGGER" />
    <FileExtension Category="Views" Extension="VW" Name="VIEW" />
    <FileExtension Category="SQL" Extension="SQL" Name="SQL" />
</FileTypeInfo>

Running Mass Download

Work Item Selection
A mass download is started by keying in a valid TFS work item id. It is intended to be a project work item but any work item suffices. The tool will validate the work item id and retrieve the title but adding a dialog to select a work item via a team or personal query or similar would be a nice enhancement.

Source Control Exclusions (Artifact Path Filtering)
Next a set of source control exclusions may be entered to exclude source code paths that do not have artifacts, such as branches containing .net application code. The tool could be changed to instead only supply paths where there are artifacts, or to ignore or include based on file extension but this fit our needs. The only paths that might need entering are ones that have changesets linked to the root work item through related work items. It will not hurt if some code paths are included; the changeset files will just get dumped into an Other folder. Currently the exclusions field is a simple newline-delimited field and I have not yet added the ability to browse and pick source control paths.


A Fetch button initiates the processing. Processing time depends on the size of the release, determined by the number of linked work items, the number of changesets in the associated tasks, source control exclusions, and file type configurations. For a maintenance release of around 80 tasks consisting of 500+ changesets, processing time was usually just under 2 minutes.

The link relationship between work items that the tool currently expects is:
Root Work Item (i.e. Project) => Scenario or Issue or Bug => Task
The root work item does not have to be a project work item type but the rest is fixed. A predicate specifies the condition for the level between the root work item and the tasks. Small changes could be made if there are additional levels between the root work item and the tasks (product backlog items for example).

Mass Download Output

When the mass download completes, it displays a log of the processing details and opens the folder it created which by default is relative to the exe location.


Files are organized into the database folder based on matches with DatabaseFileTypes.xml. The schema/database is determined from everything up to the first "." in the filename. Within the schema folders are subfolders for each of the database object types changed (views, functions, etc.), according to the configuration and the found changesets. In each of these folders reside the DB change files, current to the last (most recent) changeset in the discovered tasks. Changes found in tasks that are not Closed are moved into an _Incomplete subfolder per object type. Likewise, deleted objects are moved into _Deleted subfolders.

For each of the DB object type folders, a combined script is created (e.g. SCHEMA_A.Views (10).sql) when more than one change was found. Each database script file is also tagged with information on the last changeset made, with some exceptions.

In the root database output folder the app produces a DatabaseChanges.xml metadata file with detailed information on each database change. This is mostly for use by the Database Packager view in the app, allowing packaging the database changes at a later date than when the mass download was performed. More on the Packager view shortly.
<?xml version="1.0" encoding="utf-8"?>
<DatabaseChanges>
  <RootDatabaseFolder>C:\temp\TFSArtifactManager\2011\08\WI 21827\16 Tue 13-28-53\Database</RootDatabaseFolder>
  <Changes Count="153">
    <Included Count="0" />
    <Excluded Count="153">
        <Tasks Count="1">
          <Task Id="21939" AssignedTo="Geoffrey Hudik" State="Closed" Title="Global text change for LRMO" />
        </Tasks>
        <!-- remaining changes removed for brevity -->
    </Excluded>
  </Changes>
</DatabaseChanges>

Attachments are downloaded into an Attachments folder in the output path, with a folder per TFS task that has attachments. Task attachments with a .sql extension are also copied into an _Attachments folder under the Database folder; these are sql files that may need to be run for deployment but are not source-controlled changes (data migration scripts etc.).

Reports are simply all pulled down into a single folder according to any changesets with file changes matching ReportFileTypes.xml. Incomplete and deleted changes are organized the same as database changes are.

Task Info Files
In the Tasks subfolder various text files are created with information on tasks for the release as well as artifact changes by task. Below is a partial sample from one such file:
32 Closed Tasks with Artifacts
267 Artifact Changes

Global text change for LRMO
TFS #21939 [Closed] by Geoffrey Hudik
	[Task Attachment]

	[Task Attachment]



Artifacts by Task

Artifact changes are also output per task. This does create some file duplication, but I often found it helpful to have these file changes grouped by task, especially for subsequent, partial releases. For initial deployments, releasing by task would often mean deploying some of the same objects multiple times.

Mass Download: Database Packager

The Database Packager view provides the following functionality:
  • Viewing and filtering a list of database script files generated
  • Previewing script files in a popup SQL editor
  • Selecting which database script files to include for a deployment, e.g.:
    • choosing a combined views sql file instead of each individually
    • excluding new triggers that may be included in new table scripts
  • Specifying the order in which to run script files
    • Order is often ignored for most PL/SQL objects such as views, packages, functions
    • Invalidated objects can be fixed with recompile after all is deployed
    • Mostly used for schema changes such as table alters or additions, data migration scripts
  • "Packaging" the included script changes to a directory with filenames prefixed numerically to indicate run order


This view is first accessed via the Database Packager button next to the Fetch button once the mass download completes. It can be reopened later via the Open toolbar button (or menu) by browsing to DatabaseChanges.xml in the Database output folder.

Use of the tool is pretty straightforward: filter, sort, and inspect changes at the top, then select the appropriate changes and click Include (Insert). Items are packaged in the order they are included, but they can be rearranged individually with Move Up (Ctrl+Up) or Move Down (Ctrl+Down). Drag and drop can also rearrange the Included items, though it is a little flaky at the moment. Any SQL attachments to be included must have a schema value set, since scripts are packaged per schema. Clicking Package copies and renames the included items to a publish folder; the numeric filename prefixes currently increment by 5 to allow for manual insertions or rearranging later.
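The packaging step boils down to copying the included scripts into a publish folder with an order-encoding prefix. A simplified sketch of that idea (the per-schema folder layout and prefix format are my assumptions, not the tool's exact output):

```python
import shutil
from pathlib import Path

def package_scripts(included, publish_root: Path, step: int = 5) -> list[Path]:
    """Copy (schema, script_path) pairs into publish_root/<schema>, renaming
    each with a numeric prefix that increments by `step` so files can later be
    inserted between existing ones without renumbering everything."""
    counters: dict[str, int] = {}
    results = []
    for schema, script in included:  # `included` preserves the chosen run order
        counters[schema] = counters.get(schema, 0) + step
        dest_dir = publish_root / schema
        dest_dir.mkdir(parents=True, exist_ok=True)
        script = Path(script)
        dest = dest_dir / f"{counters[schema]:03d}_{script.name}"
        shutil.copy2(script, dest)
        results.append(dest)
    return results
```

The gap of 5 means a forgotten script can be slotted in as, say, 007_fix.sql between 005 and 010 by hand.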

All blue links may be clicked to open the corresponding file, folder, or TFS item. Clicking a filename (or selecting it and pressing the spacebar) previews the selected database file(s) in a SQL viewer. The Database Type combobox in the toolbar controls which syntax highlighting is used (PL/SQL or T-SQL).

Single Work Item Download

Mass download may be most useful for a major release, but where I work there are often ad-hoc mini-deployments such as one-off reports or small database changes. Changed artifacts for a single task are retrieved using a work item changeset retriever, accessible either via a console app (ChangesetRetriever.exe) or via the Work Item command in the TFS Artifact Manager GUI:

File output for a single task is simplified compared to mass download. Despite performing similar functions, the mass and single work item downloads were developed separately and share very little in terms of code and requirements.

Artifacts downloaded, now what?

On the report side we have a PowerShell script and an exe that work with SR.exe to deploy SSRS changes. On the database side I've started a SQL editor that can run SQL against multiple selected databases, which I use across our 6 Dev and Test environments; Quest Script Runner and Toad's Script Manager sort of work, but I found them painful to use. For environments beyond Test, I hand the packaged scripts to our DBAs to run with whatever voodoo they do. It did not quite make sense to link these deployment tools with TFS Artifact Manager at this time. The database deployment tool has a long way to go before it is ready for release, but perhaps someday.

Future Enhancements

One enhancement that would be nice is supplying a date or other criteria to the mass download piece so that only changes within a certain window are downloaded. For additional deployments I currently have to either cherry-pick new changes I know about or re-deploy everything, which is usually not feasible. Likewise, the Database Packager currently has no way to "merge" a package of database scripts from a prior run of the tool with that of a later one.
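The date-window idea amounts to filtering candidate changesets before download, and combining it with a list of already-deployed changeset ids would cover the cherry-picking case too. A trivial sketch (the changeset dicts are stand-ins for whatever the TFS API actually returns):

```python
from datetime import datetime

def select_new_changesets(changesets, start: datetime, end: datetime,
                          already_deployed: set[int]):
    """Keep changesets whose commit date falls in [start, end] and whose id
    has not already been deployed, avoiding full re-deploys and manual
    cherry-picking."""
    return [cs for cs in changesets
            if start <= cs["date"] <= end and cs["id"] not in already_deployed]
```

The hard part in practice would be persisting what was deployed where, not the filter itself.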


On the database side of things there are some tools I have started looking into that might eventually replace most use of this tool for database changes, such as Toad Extensions and Oracle's change management pack. The primary hold-up at the moment is price: I believe Toad Extensions runs around $4k, and the change management pack requires Oracle's Enterprise Edition, which is another ballpark of cost entirely.


Much of the source was written in a rush during short breaks between deployments, so it could certainly use some refactoring. The code and bits are available at